Sep 30 17:42:28 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 17:42:28 crc restorecon[4675]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:42:28 crc restorecon[4675]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc 
restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc 
restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc 
restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc 
restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:28
crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc 
restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:28 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:28 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:42:29 crc restorecon[4675]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 
crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc 
restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc 
restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:42:29 crc restorecon[4675]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc 
restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:42:29 crc restorecon[4675]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 17:42:29 crc kubenswrapper[4797]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.922682 4797 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932247 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932283 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932292 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932301 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932309 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932320 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932329 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932339 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932348 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932358 
4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932366 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932375 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932382 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932393 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932404 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932412 4797 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932421 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932476 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932485 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932492 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932500 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932509 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932517 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932524 4797 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932532 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932540 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932548 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932557 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932565 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932572 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932580 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932587 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932595 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932604 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932612 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932619 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932627 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932635 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 
17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932642 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932650 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932658 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932665 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932673 4797 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932683 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932718 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932729 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932751 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932760 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932769 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932777 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932785 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932794 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930
17:42:29.932802 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932810 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932817 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932825 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932832 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932843 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932852 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932861 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932869 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932877 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932886 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932894 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932901 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932908 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932924
4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932932 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932939 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932961 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.932971 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933110 4797 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933126 4797 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933626 4797 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933639 4797 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933650 4797 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933660 4797 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933672 4797 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933683 4797 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933693 4797 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933702 4797 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933712 4797
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933721 4797 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933731 4797 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933739 4797 flags.go:64] FLAG: --cgroup-root=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933748 4797 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933757 4797 flags.go:64] FLAG: --client-ca-file=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933766 4797 flags.go:64] FLAG: --cloud-config=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933774 4797 flags.go:64] FLAG: --cloud-provider=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933783 4797 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933802 4797 flags.go:64] FLAG: --cluster-domain=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933811 4797 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933820 4797 flags.go:64] FLAG: --config-dir=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933829 4797 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933838 4797 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933849 4797 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933858 4797 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933867 4797 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930
17:42:29.933877 4797 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933885 4797 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933894 4797 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933903 4797 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933912 4797 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933921 4797 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933933 4797 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933952 4797 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933962 4797 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933971 4797 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933980 4797 flags.go:64] FLAG: --enable-server="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.933988 4797 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934004 4797 flags.go:64] FLAG: --event-burst="100"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934014 4797 flags.go:64] FLAG: --event-qps="50"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934023 4797 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934032 4797 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934041 4797 flags.go:64] FLAG: --eviction-hard=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930
17:42:29.934052 4797 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934062 4797 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934072 4797 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934082 4797 flags.go:64] FLAG: --eviction-soft=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934091 4797 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934101 4797 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934111 4797 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934120 4797 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934133 4797 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934141 4797 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934150 4797 flags.go:64] FLAG: --feature-gates=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934161 4797 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934170 4797 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934179 4797 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934188 4797 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934197 4797 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934206 4797 flags.go:64] FLAG: --help="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930
17:42:29.934215 4797 flags.go:64] FLAG: --hostname-override=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934224 4797 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934233 4797 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934241 4797 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934250 4797 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934259 4797 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934268 4797 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934276 4797 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934285 4797 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934304 4797 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934314 4797 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934323 4797 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934332 4797 flags.go:64] FLAG: --kube-reserved=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934341 4797 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934350 4797 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934359 4797 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934368 4797 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 17:42:29 crc
kubenswrapper[4797]: I0930 17:42:29.934377 4797 flags.go:64] FLAG: --lock-file=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934385 4797 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934394 4797 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934404 4797 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934417 4797 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934426 4797 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934465 4797 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934474 4797 flags.go:64] FLAG: --logging-format="text"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934483 4797 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934493 4797 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934502 4797 flags.go:64] FLAG: --manifest-url=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934511 4797 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934522 4797 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934531 4797 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934542 4797 flags.go:64] FLAG: --max-pods="110"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934550 4797 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934559 4797 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 17:42:29 crc
kubenswrapper[4797]: I0930 17:42:29.934568 4797 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934577 4797 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934586 4797 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934595 4797 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934604 4797 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934634 4797 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934644 4797 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934653 4797 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934662 4797 flags.go:64] FLAG: --pod-cidr=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934670 4797 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934682 4797 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934701 4797 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934711 4797 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934720 4797 flags.go:64] FLAG: --port="10250"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934728 4797 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934737 4797 flags.go:64] FLAG:
--provider-id=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934746 4797 flags.go:64] FLAG: --qos-reserved=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934755 4797 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934763 4797 flags.go:64] FLAG: --register-node="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934772 4797 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934781 4797 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934799 4797 flags.go:64] FLAG: --registry-burst="10"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934808 4797 flags.go:64] FLAG: --registry-qps="5"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934817 4797 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934826 4797 flags.go:64] FLAG: --reserved-memory=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934836 4797 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934845 4797 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934854 4797 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934863 4797 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934872 4797 flags.go:64] FLAG: --runonce="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934891 4797 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934901 4797 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934910 4797 flags.go:64] FLAG: --seccomp-default="false"
Sep
30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934919 4797 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934928 4797 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934937 4797 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934946 4797 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934954 4797 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934964 4797 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934972 4797 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934981 4797 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934989 4797 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.934998 4797 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935007 4797 flags.go:64] FLAG: --system-cgroups=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935016 4797 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935029 4797 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935037 4797 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935057 4797 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935078 4797 flags.go:64] FLAG: --tls-min-version=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935087 4797 flags.go:64] FLAG:
--tls-private-key-file=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935095 4797 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935105 4797 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935113 4797 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935125 4797 flags.go:64] FLAG: --v="2"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935136 4797 flags.go:64] FLAG: --version="false"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935148 4797 flags.go:64] FLAG: --vmodule=""
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935159 4797 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.935168 4797 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935472 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935486 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935495 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935504 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935512 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935519 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935527 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935536 4797
feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935543 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935554 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935564 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935574 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935583 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935591 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935599 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935606 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935616 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935626 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935635 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935644 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935652 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935660 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935668 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935679 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935688 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935716 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935724 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935735 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935744 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935752 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935760 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935770 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935778 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935786 4797 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935794 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935803 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935811 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935818 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935826
4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935834 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935842 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935849 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935859 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935868 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935877 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935884 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935892 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935900 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935908 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935915 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935923 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935930 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935938 4797 feature_gate.go:330]
unrecognized feature gate: NetworkSegmentation
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935945 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935953 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935961 4797 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935968 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935979 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935986 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.935994 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936002 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936021 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936029 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936037 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936045 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936053 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936060 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936068 4797
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936075 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936083 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.936091 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.936853 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.952786 4797 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.952857 4797 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953001 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953017 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953024 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953031 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953038 4797 feature_gate.go:330] unrecognized feature gate: NewOLM 
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953045 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953051 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953057 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953063 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953070 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953076 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953085 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953098 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953105 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953111 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953118 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953125 4797 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953131 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953137 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953143 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953150 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953158 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953168 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953175 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953181 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953187 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953194 4797 feature_gate.go:330] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953201 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953207 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953213 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953219 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953225 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953235 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953243 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953250 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953257 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953262 4797 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953269 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953275 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953281 4797 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953287 4797 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953293 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953299 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953305 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953313 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953321 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953327 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953333 4797 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953339 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953344 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953350 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953358 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953394 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953402 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953408 4797 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953494 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953501 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953507 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953514 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953521 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953529 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953538 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953544 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953550 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953557 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953564 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953570 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953576 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953582 4797 feature_gate.go:330] unrecognized feature gate: 
PersistentIPsForVirtualization Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953588 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953593 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.953604 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953838 4797 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953854 4797 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953860 4797 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953867 4797 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953876 4797 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953882 4797 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953888 4797 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953893 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953899 4797 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953907 4797 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953917 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953923 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953929 4797 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953935 4797 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953944 4797 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953951 4797 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953957 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953963 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953968 4797 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953974 4797 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953980 4797 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953986 4797 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953992 4797 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 
17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.953997 4797 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954003 4797 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954008 4797 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954014 4797 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954020 4797 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954025 4797 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954031 4797 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954036 4797 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954042 4797 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954047 4797 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954052 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954058 4797 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954063 4797 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954069 4797 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954078 4797 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954084 4797 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954090 4797 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954095 4797 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954103 4797 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954110 4797 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954116 4797 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954122 4797 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954128 4797 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954136 4797 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954143 4797 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954149 4797 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954155 4797 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954160 4797 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954166 4797 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954173 4797 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954180 4797 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954185 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954192 4797 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954197 4797 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954203 4797 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954208 4797 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954214 4797 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954220 4797 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954225 4797 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954231 4797 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954237 4797 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954243 4797 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954248 4797 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954256 4797 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954263 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954269 4797 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954275 4797 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:42:29 crc kubenswrapper[4797]: W0930 17:42:29.954281 4797 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.954291 4797 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.954631 4797 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.964011 4797 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.964304 4797 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.967536 4797 server.go:997] "Starting client certificate rotation" Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.967602 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.969420 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-06 13:40:37.469675272 +0000 UTC Sep 30 17:42:29 crc kubenswrapper[4797]: I0930 17:42:29.969569 4797 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1603h58m7.500111402s for next certificate rotation Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.003294 4797 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.006345 4797 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.029921 4797 log.go:25] "Validated CRI v1 runtime API" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.067206 4797 log.go:25] "Validated CRI v1 image API" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.070251 4797 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.079204 4797 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-17-36-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.079273 4797 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.116961 4797 manager.go:217] Machine: {Timestamp:2025-09-30 17:42:30.113291448 +0000 UTC m=+0.635790766 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8f771605-5354-4577-b1b4-ab7637d1e89f BootID:ffa71b44-8856-40bb-9dd0-c146b8624485 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1e:8d:60 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1e:8d:60 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b0:fe:51 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d1:87:df Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ff:b6:75 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fb:3d:fb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:40:d4:7a:76:24 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4a:81:17:72:97:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.117474 4797 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.117679 4797 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.122203 4797 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.122673 4797 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.122741 4797 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.123177 4797 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.123198 4797 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.124016 4797 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.124073 4797 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.124377 4797 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.124561 4797 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.128054 4797 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.128095 4797 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.128140 4797 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.128165 4797 kubelet.go:324] "Adding apiserver pod source"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.128185 4797 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.134966 4797 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.136868 4797 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.139501 4797 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.139752 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.139870 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.139908 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.140049 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141595 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141642 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141659 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141674 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141696 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141711 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141726 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141768 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141791 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141806 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141827 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.141840 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.142660 4797 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.143472 4797 server.go:1280] "Started kubelet"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.143818 4797 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.143813 4797 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.144727 4797 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.144766 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Sep 30 17:42:30 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.146281 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.146326 4797 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.147580 4797 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.147610 4797 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.147612 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:42:27.845914819 +0000 UTC
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.147744 4797 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2378h59m57.698176806s for next certificate rotation
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.147763 4797 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.147893 4797 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.148414 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.148538 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.149375 4797 factory.go:55] Registering systemd factory
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.149422 4797 factory.go:221] Registration of the systemd container factory successfully
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.149355 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.150001 4797 factory.go:153] Registering CRI-O factory
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.150054 4797 factory.go:221] Registration of the crio container factory successfully
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.150180 4797 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.150222 4797 factory.go:103] Registering Raw factory
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.150254 4797 manager.go:1196] Started watching for new ooms in manager
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.156216 4797 manager.go:319] Starting recovery of all containers
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.165755 4797 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.170476 4797 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a205420edb44f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 17:42:30.143390799 +0000 UTC m=+0.665890067,LastTimestamp:2025-09-30 17:42:30.143390799 +0000 UTC m=+0.665890067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177245 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177389 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177425 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177494 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177524 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177550 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177582 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177609 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177641 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177672 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177701 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177729 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177756 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177788 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177818 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177847 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177894 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177924 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177951 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.177978 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178003 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178031 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178060 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178090 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178121 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178156 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178223 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178255 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178281 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178311 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178338 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178370 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178407 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178467 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178500 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178530 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178561 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178590 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178618 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178645 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178672 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178702 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178731 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178759 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178785 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178812 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178839 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178868 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178902 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178930 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178957 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.178985 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179023 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179054 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179083 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179114 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179144 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179258 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179292 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179323 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179353 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179381 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179409 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179470 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179503 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179532 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179560 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179586 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179612 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179640 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179667 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179699 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179729 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179759 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179787 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179820 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179848 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179876 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179908 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179936 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179966 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.179993 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180021 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486"
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180048 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180077 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180104 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180130 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180158 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180186 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180215 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180241 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180268 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180297 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180324 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180352 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" 
seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180382 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180415 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180481 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180512 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180540 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180567 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180596 4797 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180625 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180654 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180699 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180731 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180764 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180798 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180829 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180862 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180897 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180927 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180959 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.180987 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181013 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181039 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181067 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181095 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181172 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181205 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181236 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181264 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181290 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181323 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181350 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181378 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181407 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181476 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181526 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181553 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181581 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181608 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181635 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181661 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181687 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181714 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181745 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181773 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181798 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181825 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181851 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181878 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181908 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181932 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" 
seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181961 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.181991 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182019 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182046 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182072 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182096 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182125 4797 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182151 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182177 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182203 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182230 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182256 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182284 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182308 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182341 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182367 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182397 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182423 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182498 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182525 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182554 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182581 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182608 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182635 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182661 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182686 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182712 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182747 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182776 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182802 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182828 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182856 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182890 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182915 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182940 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182965 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.182993 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183022 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183052 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183083 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183118 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183145 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183171 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183199 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183732 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183766 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183797 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183837 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183870 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183898 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183927 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183954 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.183982 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184008 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184034 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184063 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184091 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184118 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184146 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184175 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184204 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 17:42:30 crc 
kubenswrapper[4797]: I0930 17:42:30.184233 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.184262 4797 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.189802 4797 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.189853 4797 reconstruct.go:97] "Volume reconstruction finished" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.189868 4797 reconciler.go:26] "Reconciler: start to sync state" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.193932 4797 manager.go:324] Recovery completed Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.211721 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.214149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.214204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.214223 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.215325 4797 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.215361 4797 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.215394 4797 state_mem.go:36] "Initialized new in-memory state store" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.234459 4797 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.236708 4797 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.236784 4797 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.236845 4797 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.236940 4797 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.237913 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.237974 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:30 crc 
kubenswrapper[4797]: I0930 17:42:30.247414 4797 policy_none.go:49] "None policy: Start" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.248158 4797 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.248931 4797 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.248990 4797 state_mem.go:35] "Initializing new in-memory state store" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.317934 4797 manager.go:334] "Starting Device Plugin manager" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.318051 4797 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.318076 4797 server.go:79] "Starting device plugin registration server" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.318868 4797 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.318899 4797 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.321512 4797 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.321721 4797 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.321748 4797 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.333027 4797 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.337383 4797 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.337583 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.339410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.339502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.339530 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.339801 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.340136 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.340206 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.341477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.341536 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.341557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.341812 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.341946 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.342008 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.342211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.342266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.342292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.342948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343024 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343233 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343350 
4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.343395 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344729 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.344871 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.345118 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.345199 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.345849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.345901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.345924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.346202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.346247 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.346270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.346218 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.346596 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.347623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.347663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.347681 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.350554 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="400ms" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.391886 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.391948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392030 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392064 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392188 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392254 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392367 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392472 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392518 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392566 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392599 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392636 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392668 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392699 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.392731 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.419751 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.421764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.421830 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.421850 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.421898 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.423528 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Sep 30 17:42:30 
crc kubenswrapper[4797]: I0930 17:42:30.494043 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494150 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494182 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494214 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494258 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494345 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494659 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc 
kubenswrapper[4797]: I0930 17:42:30.494684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494543 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494575 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494602 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494603 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494606 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494522 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.494469 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495167 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495237 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495273 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495405 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495410 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495425 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495486 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.495589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.623838 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.625835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.625935 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.625956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.625998 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.626678 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.119:6443: connect: connection refused" node="crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.673534 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.683778 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.705198 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.720513 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: I0930 17:42:30.729902 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.731037 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3fd5aff9a449892b3f0045dbd974bc5c3ed732e52645f62cc07d83aa49e51f1d WatchSource:0}: Error finding container 3fd5aff9a449892b3f0045dbd974bc5c3ed732e52645f62cc07d83aa49e51f1d: Status 404 returned error can't find the container with id 3fd5aff9a449892b3f0045dbd974bc5c3ed732e52645f62cc07d83aa49e51f1d Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.735137 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-7abf1748778026a4ff1672619e8c2424df7659dc6a43789237aeaf620bb6cbb4 WatchSource:0}: Error finding container 7abf1748778026a4ff1672619e8c2424df7659dc6a43789237aeaf620bb6cbb4: Status 404 returned error 
can't find the container with id 7abf1748778026a4ff1672619e8c2424df7659dc6a43789237aeaf620bb6cbb4 Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.745156 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3243fa6cd772a909c9950d5bf9d2c7048d8993aed60fa0b0cd3f02d358d1d0b9 WatchSource:0}: Error finding container 3243fa6cd772a909c9950d5bf9d2c7048d8993aed60fa0b0cd3f02d358d1d0b9: Status 404 returned error can't find the container with id 3243fa6cd772a909c9950d5bf9d2c7048d8993aed60fa0b0cd3f02d358d1d0b9 Sep 30 17:42:30 crc kubenswrapper[4797]: E0930 17:42:30.752632 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.753389 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c8af85896e6a1e42e71c590b606fc681e016ba0519898ce919d1b69e2994c72e WatchSource:0}: Error finding container c8af85896e6a1e42e71c590b606fc681e016ba0519898ce919d1b69e2994c72e: Status 404 returned error can't find the container with id c8af85896e6a1e42e71c590b606fc681e016ba0519898ce919d1b69e2994c72e Sep 30 17:42:30 crc kubenswrapper[4797]: W0930 17:42:30.756577 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-582325c1cc8860b703221212e5803f7a19612bf3015ab83ae7278f6c09d43106 WatchSource:0}: Error finding container 582325c1cc8860b703221212e5803f7a19612bf3015ab83ae7278f6c09d43106: Status 404 returned error can't find the container with id 
582325c1cc8860b703221212e5803f7a19612bf3015ab83ae7278f6c09d43106 Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.026841 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.028273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.028317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.028330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.028357 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.028885 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.145863 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.245283 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3fd5aff9a449892b3f0045dbd974bc5c3ed732e52645f62cc07d83aa49e51f1d"} Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.246662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"582325c1cc8860b703221212e5803f7a19612bf3015ab83ae7278f6c09d43106"} Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.248083 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c8af85896e6a1e42e71c590b606fc681e016ba0519898ce919d1b69e2994c72e"} Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.249288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3243fa6cd772a909c9950d5bf9d2c7048d8993aed60fa0b0cd3f02d358d1d0b9"} Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.250315 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7abf1748778026a4ff1672619e8c2424df7659dc6a43789237aeaf620bb6cbb4"} Sep 30 17:42:31 crc kubenswrapper[4797]: W0930 17:42:31.422214 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.422763 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:31 crc kubenswrapper[4797]: W0930 17:42:31.468205 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.468315 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.554225 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Sep 30 17:42:31 crc kubenswrapper[4797]: W0930 17:42:31.633936 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.634050 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:31 crc kubenswrapper[4797]: W0930 17:42:31.718637 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:31 crc kubenswrapper[4797]: 
E0930 17:42:31.718735 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.829530 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.831291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.831332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.831347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:31 crc kubenswrapper[4797]: I0930 17:42:31.831377 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:31 crc kubenswrapper[4797]: E0930 17:42:31.831931 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.145677 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.255802 4797 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d" exitCode=0 Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.255914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.255953 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.257370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.257418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.257467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.257920 4797 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3" exitCode=0 Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.257952 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.258062 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.259056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.259107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.259126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.260862 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.260997 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c" exitCode=0 Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.261066 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.261156 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.261973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.262003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.262015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.262642 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.262693 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.262710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.263540 4797 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56" exitCode=0 Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.263599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.263618 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.265052 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.265099 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.265118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.269675 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.269726 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.269744 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.269760 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9"} Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.269782 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.270653 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.270686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:32 crc kubenswrapper[4797]: I0930 17:42:32.270697 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.145524 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:33 crc kubenswrapper[4797]: E0930 17:42:33.155909 4797 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="3.2s" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.275273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.275342 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.275354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.275457 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.276680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.276713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.276726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.279571 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.279623 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.279639 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.279652 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.281850 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63" exitCode=0 Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.281945 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.282089 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.283257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.283301 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.283317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.284263 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.284282 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb"} Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.284314 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.284960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.284992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.285004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.285554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.285604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.285621 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.432088 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:33 crc kubenswrapper[4797]: W0930 17:42:33.432897 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:33 crc kubenswrapper[4797]: E0930 17:42:33.432959 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.433350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.433394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.433404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:33 crc kubenswrapper[4797]: I0930 17:42:33.433449 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:33 crc kubenswrapper[4797]: E0930 17:42:33.433821 4797 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Sep 30 17:42:34 crc kubenswrapper[4797]: W0930 17:42:34.011910 4797 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:34 crc kubenswrapper[4797]: E0930 17:42:34.011996 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.146397 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.293067 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5"} Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.293155 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.294075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.294130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.294144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 
17:42:34.296351 4797 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716" exitCode=0 Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.296418 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716"} Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.296453 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.296529 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.296538 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.296599 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 
30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297772 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.297788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.553267 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.553482 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.554744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.554781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:34 crc kubenswrapper[4797]: I0930 17:42:34.554793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307522 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd"} Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307576 4797 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307600 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef"} Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307618 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307619 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc"} Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.307704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b"} Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.308561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.308602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.308617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:35 crc kubenswrapper[4797]: I0930 17:42:35.550855 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.317315 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05"} Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.317383 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.317410 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.319755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.634678 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.636628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.636716 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.636741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:36 crc kubenswrapper[4797]: I0930 17:42:36.636789 4797 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.081538 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.081887 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.084279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.084488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.084511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.093423 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.320017 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.320032 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321317 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321366 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.321638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.433363 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.554351 4797 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 17:42:37 crc kubenswrapper[4797]: I0930 17:42:37.554490 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.323822 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:38 crc 
kubenswrapper[4797]: I0930 17:42:38.325562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.325620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.325640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.599762 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.600104 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.601863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.601937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.601974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:38 crc kubenswrapper[4797]: I0930 17:42:38.798903 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.320953 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.321158 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.322416 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.322500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.322521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.325331 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.326560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.326594 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.326605 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.906584 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.906788 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.908255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.908303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:39 crc kubenswrapper[4797]: I0930 17:42:39.908315 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:40 crc 
kubenswrapper[4797]: E0930 17:42:40.333498 4797 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.959586 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.959805 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.961077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.961134 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.961222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:41 crc kubenswrapper[4797]: I0930 17:42:41.964999 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.146175 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.146406 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.148040 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.148081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 
17:42:42.148092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.331957 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.332900 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.332939 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:42 crc kubenswrapper[4797]: I0930 17:42:42.332951 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:44 crc kubenswrapper[4797]: W0930 17:42:44.414138 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:42:44 crc kubenswrapper[4797]: I0930 17:42:44.414267 4797 trace.go:236] Trace[2017799341]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:42:34.412) (total time: 10001ms): Sep 30 17:42:44 crc kubenswrapper[4797]: Trace[2017799341]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:42:44.414) Sep 30 17:42:44 crc kubenswrapper[4797]: Trace[2017799341]: [10.00130957s] [10.00130957s] END Sep 30 17:42:44 crc kubenswrapper[4797]: E0930 17:42:44.414296 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake 
timeout" logger="UnhandledError" Sep 30 17:42:44 crc kubenswrapper[4797]: W0930 17:42:44.638698 4797 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:42:44 crc kubenswrapper[4797]: I0930 17:42:44.638808 4797 trace.go:236] Trace[474576249]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:42:34.636) (total time: 10002ms): Sep 30 17:42:44 crc kubenswrapper[4797]: Trace[474576249]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (17:42:44.638) Sep 30 17:42:44 crc kubenswrapper[4797]: Trace[474576249]: [10.002133396s] [10.002133396s] END Sep 30 17:42:44 crc kubenswrapper[4797]: E0930 17:42:44.638837 4797 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 17:42:45 crc kubenswrapper[4797]: I0930 17:42:45.146576 4797 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:42:45 crc kubenswrapper[4797]: I0930 17:42:45.643763 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get 
path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 17:42:45 crc kubenswrapper[4797]: I0930 17:42:45.643825 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:42:45 crc kubenswrapper[4797]: I0930 17:42:45.647655 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 17:42:45 crc kubenswrapper[4797]: I0930 17:42:45.647718 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:42:47 crc kubenswrapper[4797]: I0930 17:42:47.554688 4797 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 17:42:47 crc kubenswrapper[4797]: I0930 17:42:47.555697 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.382424 4797 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.607047 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.607519 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.609060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.609231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.609305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:48 crc kubenswrapper[4797]: I0930 17:42:48.612673 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:49 crc kubenswrapper[4797]: I0930 17:42:49.351493 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:42:49 crc kubenswrapper[4797]: I0930 17:42:49.351555 4797 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:42:49 crc kubenswrapper[4797]: I0930 17:42:49.352742 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:49 crc kubenswrapper[4797]: I0930 17:42:49.352838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:49 crc kubenswrapper[4797]: I0930 17:42:49.352867 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.105433 4797 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.333775 4797 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.636930 4797 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.637279 4797 trace.go:236] Trace[1100834382]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:42:38.395) (total time: 12241ms): Sep 30 17:42:50 crc kubenswrapper[4797]: Trace[1100834382]: ---"Objects listed" error: 12241ms (17:42:50.637) Sep 30 17:42:50 crc kubenswrapper[4797]: Trace[1100834382]: [12.241902687s] [12.241902687s] END Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.637332 4797 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.637834 4797 trace.go:236] Trace[1810089013]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:42:38.837) (total time: 11800ms): Sep 30 17:42:50 crc kubenswrapper[4797]: Trace[1810089013]: ---"Objects listed" error: 11800ms (17:42:50.637) Sep 30 17:42:50 crc kubenswrapper[4797]: Trace[1810089013]: [11.800625473s] [11.800625473s] END Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.637869 4797 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.639748 4797 
reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.649300 4797 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.649485 4797 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.651182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.651245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.651265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.651297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.651318 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.669512 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4
-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.673570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.673622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.673638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.673668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.673686 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.680009 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.680022 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45328->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.680076 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.680188 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45328->192.168.126.11:17697: read: connection reset by peer" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.682722 4797 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50932->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.682773 4797 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50932->192.168.126.11:17697: read: connection reset by peer" Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.686169 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"[container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.i
o/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421
a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4
ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeByt
es\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.696292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.696483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.696508 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.696553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.696607 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.712816 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4
-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.717711 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.717877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.717933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.717974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.717999 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.728658 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4
-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.733000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.733048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.733058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.733079 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.733093 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.743313 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4
-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:50 crc kubenswrapper[4797]: E0930 17:42:50.743490 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.745234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.745306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.745322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.745359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.745375 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.848145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.848195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.848208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.848239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.848254 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.950328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.950381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.950395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.950418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:50 crc kubenswrapper[4797]: I0930 17:42:50.950433 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:50Z","lastTransitionTime":"2025-09-30T17:42:50Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.052811 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.052866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.052879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.052904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.052920 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.142254 4797 apiserver.go:52] "Watching apiserver" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.145364 4797 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.145617 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.145972 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.146014 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.146040 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.146104 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.146157 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.146273 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.146307 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.146607 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.146722 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.148721 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.148921 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.148964 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.151934 4797 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.153598 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.153905 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.154062 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.159367 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.159768 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.160517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 
17:42:51.160567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.160583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.160608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.160625 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.163094 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.185835 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.200473 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.212205 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.223653 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.236590 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243732 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243751 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243769 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243785 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243802 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243818 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243835 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243856 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243872 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243888 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243907 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243922 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243940 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243960 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243980 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.243999 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244021 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244039 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244160 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244174 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244202 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244226 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244254 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244274 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244302 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244318 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244336 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244353 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244371 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244390 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.244475 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:42:51.744422053 +0000 UTC m=+22.266921281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244570 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244621 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244666 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244679 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244713 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244745 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244772 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244797 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244850 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244875 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244905 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244930 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244955 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:42:51 
crc kubenswrapper[4797]: I0930 17:42:51.244982 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245004 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245031 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245060 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245095 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245122 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245151 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245179 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245206 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245260 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:42:51 crc 
kubenswrapper[4797]: I0930 17:42:51.245289 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245319 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245345 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245378 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245458 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245486 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245516 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245540 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245565 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 
17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245624 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245657 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245724 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245756 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245783 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245809 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245838 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245871 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245900 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245926 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:42:51 crc kubenswrapper[4797]: 
I0930 17:42:51.245954 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245986 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246050 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246079 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246106 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246161 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246187 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246215 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246264 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246292 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246320 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246346 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246371 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246397 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246419 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246528 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246570 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246594 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 
17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246620 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246666 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246690 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246718 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246742 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246768 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246792 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246852 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246880 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 
17:42:51.246907 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246931 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246958 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246985 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247008 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247036 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247063 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247089 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247113 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247138 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247164 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:42:51 crc 
kubenswrapper[4797]: I0930 17:42:51.247191 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247219 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247244 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247271 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247299 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247329 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247354 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247380 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247408 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247454 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247510 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247533 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247558 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247605 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247628 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247653 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247709 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247734 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247787 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:42:51 
crc kubenswrapper[4797]: I0930 17:42:51.247813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247841 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247893 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247920 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247961 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247989 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248013 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248039 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248094 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248121 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248153 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248183 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248214 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248241 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248269 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:42:51 crc 
kubenswrapper[4797]: I0930 17:42:51.248297 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248324 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248351 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248377 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248402 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248430 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248479 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248507 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248551 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248578 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 
17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248649 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248678 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248704 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248731 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248758 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248786 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248846 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248875 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248904 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248930 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:42:51 crc kubenswrapper[4797]: 
I0930 17:42:51.248959 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248985 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249011 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249035 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249059 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249087 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249114 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249137 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249166 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249193 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249222 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249249 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249276 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249346 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249383 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249416 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249471 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249502 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249532 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249620 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249646 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249674 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249701 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249730 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249758 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249868 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249888 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249911 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244875 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244917 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244950 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.244983 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245060 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245049 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245054 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245182 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245299 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245330 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245359 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245413 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245574 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245656 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245715 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245720 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245745 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245838 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.245949 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246033 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246267 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246314 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246452 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246631 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246804 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.246933 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.247018 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248112 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248210 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248235 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248240 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248386 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248565 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248707 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248777 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.248813 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249098 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249208 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249240 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.249793 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.249987 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.253756 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:51.753725209 +0000 UTC m=+22.276224447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.254101 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.254408 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.254708 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.254745 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.254923 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255107 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255137 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250143 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250395 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250479 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250750 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250784 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250912 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251129 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251287 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251375 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251478 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251757 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251850 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251971 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.251987 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252033 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252309 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252617 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252742 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255454 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.252846 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250333 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255905 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.255944 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.256406 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.256562 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.256781 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.257240 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.257273 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.257740 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.258184 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.258222 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.258626 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.259853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.259994 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.260426 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.261803 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.261953 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262093 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262126 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262338 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262537 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262570 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262607 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262626 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262971 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.262856 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.263082 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.263123 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.250115 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.263299 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.263182 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.263600 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264098 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264104 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264160 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264189 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264325 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264562 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264647 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.264289 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.265076 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.265342 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.265582 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:51.765545161 +0000 UTC m=+22.288044639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.265639 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266267 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266415 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266645 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266816 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266814 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.267015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.267214 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.267816 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.267820 4797 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.268495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.266400 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.271102 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.271236 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.271695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.271797 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.271866 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.273795 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.282326 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.282378 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.282397 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.282513 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:42:51.782487084 +0000 UTC m=+22.304986312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.283116 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.283148 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.283163 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.283245 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:51.783230837 +0000 UTC m=+22.305730075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.285911 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.292851 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.293034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.293029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.293164 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.293317 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.293397 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.294300 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.294639 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.294773 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.294931 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.296156 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.297030 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.297349 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.297609 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.299212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.303136 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.304263 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.312709 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315383 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315510 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315557 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315658 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315719 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315737 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.315910 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316026 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316270 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316585 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316622 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.316835 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.317021 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.317526 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.318062 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.318132 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.318722 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.318963 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.319153 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.319284 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.319257 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.319787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.319928 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.321623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.321794 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.321818 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322188 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322387 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322440 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322466 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.322902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323119 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323173 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323192 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323202 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323497 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.323979 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.324674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.325218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.325309 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.325427 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.325764 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.327999 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.330827 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.340679 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.346743 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.349652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350643 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350704 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350760 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350773 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350784 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350793 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350801 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350810 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350810 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350819 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350886 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350899 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350842 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350913 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350954 4797 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350969 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350981 4797 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.350992 4797 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351025 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351040 4797 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351052 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351064 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351075 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351108 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351121 4797 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351133 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351145 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351157 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351192 4797 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351205 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351218 4797 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351231 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351244 4797 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351280 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351292 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351308 4797 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351320 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351356 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351371 4797 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351384 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351396 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351446 4797 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351463 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351475 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351488 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351531 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351544 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351557 4797 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351570 4797 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351609 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351623 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351637 4797 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351650 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351685 4797 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351698 4797 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351712 4797 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351727 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351770 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351786 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351799 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351811 4797 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351821 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351859 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351870 4797 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351882 4797 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351893 4797 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351929 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351945 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351958 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351972 4797 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.351984 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.352034 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.352049 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354211 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354248 4797 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354266 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354280 4797 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354291 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354323 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354353 4797 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354365 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354392 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354406 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354418 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Sep 30
17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354439 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354451 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354462 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354472 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354482 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354492 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354504 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354515 4797 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354526 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354537 4797 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354547 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354558 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354571 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354589 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354601 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354611 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354620 4797 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354630 4797 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354641 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354652 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354663 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354673 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354686 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354697 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354707 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354718 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354728 4797 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354738 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354750 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: 
I0930 17:42:51.354763 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354777 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354788 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354800 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354810 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354821 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354834 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354844 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354855 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354865 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354874 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354884 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354894 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354910 4797 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354923 4797 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354936 4797 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354950 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354961 4797 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354972 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354982 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.354992 4797 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355004 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355015 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355026 4797 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355036 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355049 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355061 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355072 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355083 4797 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355093 4797 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355104 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355116 4797 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355127 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355138 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355149 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355163 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355174 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355184 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355195 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355204 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355214 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355225 4797 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355234 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355247 4797 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" 
DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355258 4797 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355268 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355279 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355289 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355300 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355310 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355309 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355321 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355381 4797 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355398 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355412 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355422 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355452 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355462 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355472 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355482 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355492 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355501 4797 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355511 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355521 4797 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355530 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" 
DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355540 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355551 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355559 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.355568 4797 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.356231 4797 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.356248 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.356259 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.356293 4797 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358184 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358197 4797 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358208 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358219 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358252 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358264 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358276 4797 reconciler_common.go:293] "Volume 
detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358328 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.358339 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.362055 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.364060 4797 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5" exitCode=255 Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.364125 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.374902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.374954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.374971 4797 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.374999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.375016 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.375598 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.384339 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.409150 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.422417 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.434033 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.443684 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.459406 4797 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.463311 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.473515 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.477324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.477368 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.477378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.477445 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.477464 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.479958 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:42:51 crc kubenswrapper[4797]: W0930 17:42:51.481265 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-3660a338222a8e03748da2040ec8aa78836bb5c7c9e9d5a49d53d3728bbacac8 WatchSource:0}: Error finding container 3660a338222a8e03748da2040ec8aa78836bb5c7c9e9d5a49d53d3728bbacac8: Status 404 returned error can't find the container with id 3660a338222a8e03748da2040ec8aa78836bb5c7c9e9d5a49d53d3728bbacac8 Sep 30 17:42:51 crc kubenswrapper[4797]: W0930 17:42:51.488275 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f0104aa5fbf236f3de7db79745952105be813095849abe2518ef7c3233c13fad WatchSource:0}: Error finding container f0104aa5fbf236f3de7db79745952105be813095849abe2518ef7c3233c13fad: Status 404 returned error can't find the container with id f0104aa5fbf236f3de7db79745952105be813095849abe2518ef7c3233c13fad Sep 30 17:42:51 crc kubenswrapper[4797]: W0930 17:42:51.497554 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-29c22e8779c0f0b39b7e8aa7d318e6bffa5b7815976fae38f735b57b2dc75d6e WatchSource:0}: Error finding container 29c22e8779c0f0b39b7e8aa7d318e6bffa5b7815976fae38f735b57b2dc75d6e: Status 404 returned error can't find the container with id 29c22e8779c0f0b39b7e8aa7d318e6bffa5b7815976fae38f735b57b2dc75d6e Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.579633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.579710 4797 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.579726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.579751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.579773 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.591105 4797 scope.go:117] "RemoveContainer" containerID="f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.592424 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.682174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.682215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.682227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.682247 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.682258 4797 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.762131 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.762546 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:42:52.76251256 +0000 UTC m=+23.285011798 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.762653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.762777 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.762845 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:52.762813419 +0000 UTC m=+23.285312657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.785854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.786302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.786319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.786347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.786369 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.863953 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.864016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.864093 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.864282 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865022 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865046 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865123 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:52.865100728 +0000 UTC m=+23.387600146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.864309 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865288 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865305 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.864314 4797 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865411 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:52.865391167 +0000 UTC m=+23.387890405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:51 crc kubenswrapper[4797]: E0930 17:42:51.865425 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:52.865419137 +0000 UTC m=+23.387918375 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.890154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.890205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.890216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.890235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.890247 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.992709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.992735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.992743 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.992758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:51 crc kubenswrapper[4797]: I0930 17:42:51.992767 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:51Z","lastTransitionTime":"2025-09-30T17:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.095398 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.095441 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.095452 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.095468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.095478 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.198394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.198460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.198473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.198490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.198500 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.208166 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.224702 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.231383 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.232051 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.241035 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.241675 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.243109 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.243297 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.243865 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.244974 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.245619 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.246306 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.247650 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.248342 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.249488 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.252888 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.253577 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.255329 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.256018 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.256895 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.257503 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.258081 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.260062 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.260912 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.261808 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.261915 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 17:42:52 crc 
kubenswrapper[4797]: I0930 17:42:52.264191 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.265583 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.267108 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.268631 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.269657 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.270464 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.272085 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.272728 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 17:42:52 crc 
kubenswrapper[4797]: I0930 17:42:52.273494 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.274710 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.274902 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.275741 4797 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.275872 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.286900 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.298543 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.298628 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.300800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.300931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.301055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.301194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.301313 4797 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.301661 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.302661 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.305634 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.307283 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.308174 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.310169 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.311720 4797 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.313493 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.314398 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.316201 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.317277 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.318870 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.319312 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.319690 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.321185 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.322310 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.324044 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.324894 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.326314 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.327290 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.328371 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.330327 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.331294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.342001 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.353097 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.364050 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.367696 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.367764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3660a338222a8e03748da2040ec8aa78836bb5c7c9e9d5a49d53d3728bbacac8"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.369656 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.371376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.372704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.372750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"29c22e8779c0f0b39b7e8aa7d318e6bffa5b7815976fae38f735b57b2dc75d6e"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.374331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f0104aa5fbf236f3de7db79745952105be813095849abe2518ef7c3233c13fad"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.377129 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.394546 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.404115 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.404174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.404187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.404212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.404223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.407837 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 
17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.422596 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.508167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.508207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.508220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.508244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.508258 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.611479 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.611527 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.611544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.611567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.611581 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.715062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.715111 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.715124 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.715142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.715155 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.745814 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hprkg"] Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.746176 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.748373 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.748526 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.748702 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.762808 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 
17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.771481 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.771672 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02901280-033f-4eb8-91bd-c1a5ba1358c0-hosts-file\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.771726 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwpd\" (UniqueName: \"kubernetes.io/projected/02901280-033f-4eb8-91bd-c1a5ba1358c0-kube-api-access-jcwpd\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.771810 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.771950 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.772045 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:54.77201898 +0000 UTC m=+25.294518228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.772374 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:42:54.772301649 +0000 UTC m=+25.294800927 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.786026 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.801431 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.817308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.817363 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.817415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.817471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.817493 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.833797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.846103 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.856923 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.868323 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.872132 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:52 crc kubenswrapper[4797]: 
I0930 17:42:52.872168 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02901280-033f-4eb8-91bd-c1a5ba1358c0-hosts-file\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.872191 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.872212 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.872237 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwpd\" (UniqueName: \"kubernetes.io/projected/02901280-033f-4eb8-91bd-c1a5ba1358c0-kube-api-access-jcwpd\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872312 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872372 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:54.872355682 +0000 UTC m=+25.394854920 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.872428 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02901280-033f-4eb8-91bd-c1a5ba1358c0-hosts-file\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872533 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872588 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872610 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872533 4797 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872687 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872703 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:54.872669702 +0000 UTC m=+25.395169120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872704 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:52 crc kubenswrapper[4797]: E0930 17:42:52.872768 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:54.872759184 +0000 UTC m=+25.395258442 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.879470 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.891096 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.896062 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcwpd\" (UniqueName: \"kubernetes.io/projected/02901280-033f-4eb8-91bd-c1a5ba1358c0-kube-api-access-jcwpd\") pod \"node-resolver-hprkg\" (UID: \"02901280-033f-4eb8-91bd-c1a5ba1358c0\") " pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.919777 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.919817 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.919827 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.919843 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:52 crc kubenswrapper[4797]: I0930 17:42:52.919857 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:52Z","lastTransitionTime":"2025-09-30T17:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.022917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.022962 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.022972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.022991 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.023001 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.073560 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hprkg" Sep 30 17:42:53 crc kubenswrapper[4797]: W0930 17:42:53.086453 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02901280_033f_4eb8_91bd_c1a5ba1358c0.slice/crio-ef47e7bcd77bab4776b1df164bff9feb1532cfae70994c10e133449e06fdc796 WatchSource:0}: Error finding container ef47e7bcd77bab4776b1df164bff9feb1532cfae70994c10e133449e06fdc796: Status 404 returned error can't find the container with id ef47e7bcd77bab4776b1df164bff9feb1532cfae70994c10e133449e06fdc796 Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.125062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.125108 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.125119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.125138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.125149 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.133037 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b8bg9"] Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.133475 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.135977 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-w74xm"] Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.136489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.137362 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.137806 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.137942 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.137989 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.138268 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.138517 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.139905 4797 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.140156 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.140575 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.140965 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.141879 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4zbp6"] Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.142645 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.144319 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.144640 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.149804 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.167840 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175138 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-cnibin\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-cni-binary-copy\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175393 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ec455803-9758-4ad4-a627-ce3ad63812c2-rootfs\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175506 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfj8r\" (UniqueName: \"kubernetes.io/projected/ec455803-9758-4ad4-a627-ce3ad63812c2-kube-api-access-rfj8r\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmnnd\" (UniqueName: 
\"kubernetes.io/projected/aba20a5a-9a27-4df1-899d-a107aef7a231-kube-api-access-tmnnd\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175735 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-hostroot\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175837 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-system-cni-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.175935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176033 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176141 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec455803-9758-4ad4-a627-ce3ad63812c2-proxy-tls\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176233 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec455803-9758-4ad4-a627-ce3ad63812c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176329 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-os-release\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176421 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-cnibin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176543 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-k8s-cni-cncf-io\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176650 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-multus\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176751 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-binary-copy\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176840 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-conf-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.176943 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-multus-certs\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177040 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-socket-dir-parent\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 
17:42:53.177143 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-os-release\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177244 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5k26\" (UniqueName: \"kubernetes.io/projected/5be80c8f-41bd-41be-a86f-8c69e7655592-kube-api-access-l5k26\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177335 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-bin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177441 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-kubelet\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177527 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-netns\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 
17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177624 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-etc-kubernetes\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177780 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-daemon-config\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177833 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.177862 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-system-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.184626 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.197451 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.210242 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.221728 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.231469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.231528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.231541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.231563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.231579 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.233131 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.237826 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.237882 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.237974 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:53 crc kubenswrapper[4797]: E0930 17:42:53.238250 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:42:53 crc kubenswrapper[4797]: E0930 17:42:53.238946 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:42:53 crc kubenswrapper[4797]: E0930 17:42:53.239184 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.254675 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.266534 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 
17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.278219 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.278921 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-daemon-config\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.278962 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.278987 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-system-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279012 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-cnibin\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-cni-binary-copy\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279056 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ec455803-9758-4ad4-a627-ce3ad63812c2-rootfs\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279082 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfj8r\" (UniqueName: \"kubernetes.io/projected/ec455803-9758-4ad4-a627-ce3ad63812c2-kube-api-access-rfj8r\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279104 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmnnd\" (UniqueName: \"kubernetes.io/projected/aba20a5a-9a27-4df1-899d-a107aef7a231-kube-api-access-tmnnd\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279145 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-hostroot\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279170 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-system-cni-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279196 4797 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279242 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec455803-9758-4ad4-a627-ce3ad63812c2-proxy-tls\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec455803-9758-4ad4-a627-ce3ad63812c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279284 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-os-release\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279306 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-cnibin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279337 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-k8s-cni-cncf-io\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279365 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-multus\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-binary-copy\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-conf-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279433 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-multus-certs\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-socket-dir-parent\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279511 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-os-release\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5k26\" (UniqueName: \"kubernetes.io/projected/5be80c8f-41bd-41be-a86f-8c69e7655592-kube-api-access-l5k26\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279560 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-bin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279583 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-kubelet\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279606 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-netns\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.279627 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-etc-kubernetes\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec455803-9758-4ad4-a627-ce3ad63812c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-etc-kubernetes\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281685 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-os-release\") pod \"multus-w74xm\" (UID: 
\"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281731 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-cnibin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-k8s-cni-cncf-io\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.281790 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-multus\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282353 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-daemon-config\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-binary-copy\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 
17:42:53.282521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-conf-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282573 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-multus-certs\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282653 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-system-cni-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282723 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-socket-dir-parent\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-os-release\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282923 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-hostroot\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.282950 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-multus-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283012 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-system-cni-dir\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283040 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-cnibin\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283078 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-kubelet\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283100 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-run-netns\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " 
pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba20a5a-9a27-4df1-899d-a107aef7a231-host-var-lib-cni-bin\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283129 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ec455803-9758-4ad4-a627-ce3ad63812c2-rootfs\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283743 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5be80c8f-41bd-41be-a86f-8c69e7655592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5be80c8f-41bd-41be-a86f-8c69e7655592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.283891 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba20a5a-9a27-4df1-899d-a107aef7a231-cni-binary-copy\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc 
kubenswrapper[4797]: I0930 17:42:53.289020 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec455803-9758-4ad4-a627-ce3ad63812c2-proxy-tls\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.303844 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.309832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfj8r\" (UniqueName: \"kubernetes.io/projected/ec455803-9758-4ad4-a627-ce3ad63812c2-kube-api-access-rfj8r\") pod \"machine-config-daemon-b8bg9\" (UID: \"ec455803-9758-4ad4-a627-ce3ad63812c2\") " pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.315329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmnnd\" (UniqueName: \"kubernetes.io/projected/aba20a5a-9a27-4df1-899d-a107aef7a231-kube-api-access-tmnnd\") pod \"multus-w74xm\" (UID: \"aba20a5a-9a27-4df1-899d-a107aef7a231\") " pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.322298 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5k26\" (UniqueName: \"kubernetes.io/projected/5be80c8f-41bd-41be-a86f-8c69e7655592-kube-api-access-l5k26\") pod \"multus-additional-cni-plugins-4zbp6\" (UID: \"5be80c8f-41bd-41be-a86f-8c69e7655592\") " 
pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.330889 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.338622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.338659 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.338667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.338684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.338694 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.360733 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.381927 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.386157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.387695 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hprkg" event={"ID":"02901280-033f-4eb8-91bd-c1a5ba1358c0","Type":"ContainerStarted","Data":"d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.387737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hprkg" event={"ID":"02901280-033f-4eb8-91bd-c1a5ba1358c0","Type":"ContainerStarted","Data":"ef47e7bcd77bab4776b1df164bff9feb1532cfae70994c10e133449e06fdc796"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.388343 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.407206 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.423710 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 
17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.437696 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.441925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.441978 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.441989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.442008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.442022 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.450859 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.451144 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.459251 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w74xm" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.466004 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.472652 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: W0930 17:42:53.474278 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaba20a5a_9a27_4df1_899d_a107aef7a231.slice/crio-9b15190241ab0dce6d57dbf1ec011cdba5c7ba17acfffff75853de1a173cea98 WatchSource:0}: Error finding container 9b15190241ab0dce6d57dbf1ec011cdba5c7ba17acfffff75853de1a173cea98: Status 404 returned error can't find the container with id 9b15190241ab0dce6d57dbf1ec011cdba5c7ba17acfffff75853de1a173cea98 Sep 30 17:42:53 crc kubenswrapper[4797]: W0930 17:42:53.488843 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be80c8f_41bd_41be_a86f_8c69e7655592.slice/crio-3af2e6e148203e48b088eb9c07ec5f9cd5f200ed4dc2d602b017d16ece0a0f43 WatchSource:0}: Error finding container 3af2e6e148203e48b088eb9c07ec5f9cd5f200ed4dc2d602b017d16ece0a0f43: Status 404 returned error can't find the container with id 
3af2e6e148203e48b088eb9c07ec5f9cd5f200ed4dc2d602b017d16ece0a0f43 Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.495537 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.510014 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.524864 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.526063 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g447b"] Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.526882 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.531666 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.531691 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.537871 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.538074 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.538241 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.538456 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.538688 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.547111 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.547160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.547174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.547197 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.547212 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.553706 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.571574 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: 
\"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583153 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583175 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583194 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583243 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn\") pod 
\"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583259 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583272 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583287 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583320 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd\") pod 
\"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583336 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583351 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583367 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583383 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583400 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583445 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583467 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72jb\" (UniqueName: \"kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583491 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.583510 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.587362 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.601504 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.622646 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a228613
9980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.642208 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.654619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.654685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.654700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.654722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.654739 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.660954 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.677160 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.684949 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685005 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685073 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685095 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685119 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685148 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685177 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685205 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685224 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t72jb\" (UniqueName: \"kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685241 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685257 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin\") 
pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685283 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685300 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685315 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685342 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685364 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn\") pod \"ovnkube-node-g447b\" (UID: 
\"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685394 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685544 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685588 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc 
kubenswrapper[4797]: I0930 17:42:53.685614 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686154 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685256 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.685108 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686351 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686401 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686429 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686807 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686893 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686983 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.687010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.686991 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.687035 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.687064 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.695218 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert\") pod 
\"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.697943 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.705640 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72jb\" (UniqueName: \"kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb\") pod \"ovnkube-node-g447b\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") " pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.711194 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.727793 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.753756 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82
204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.758667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.758722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.758733 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.758750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.758762 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.785283 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af8
2204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.804229 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.824723 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.847529 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.848643 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: W0930 17:42:53.859971 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c749a60_66ac_44d6_955f_a3d050b12758.slice/crio-7e2e978dc307315b100e524c1dfeee0884affbca1c1bbb630bf16b2e1e1b69c4 WatchSource:0}: Error finding container 7e2e978dc307315b100e524c1dfeee0884affbca1c1bbb630bf16b2e1e1b69c4: Status 404 returned error can't find the container with id 7e2e978dc307315b100e524c1dfeee0884affbca1c1bbb630bf16b2e1e1b69c4 Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.861306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.861413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.861536 4797 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.861647 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.861738 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.866028 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.887502 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.905665 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.921515 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.936840 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.951897 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 
17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.965157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.965221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.965233 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.965254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.965266 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:53Z","lastTransitionTime":"2025-09-30T17:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.967549 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:53 crc kubenswrapper[4797]: I0930 17:42:53.983078 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.004841 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.068505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.068576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.068593 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.068620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.068641 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.174204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.174250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.174280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.174298 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.174312 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.277676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.278610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.278791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.278889 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.278995 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.381631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.382032 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.382105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.382195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.382272 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.392768 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.394492 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" exitCode=0 Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.394567 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.394605 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"7e2e978dc307315b100e524c1dfeee0884affbca1c1bbb630bf16b2e1e1b69c4"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.397460 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb" exitCode=0 Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.397598 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.397695 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerStarted","Data":"3af2e6e148203e48b088eb9c07ec5f9cd5f200ed4dc2d602b017d16ece0a0f43"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.400556 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerStarted","Data":"df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.400595 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerStarted","Data":"9b15190241ab0dce6d57dbf1ec011cdba5c7ba17acfffff75853de1a173cea98"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.402999 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.403052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.403065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"baf06f430ee2ccfa9734b83153692256965cc1742c04141e35c8880f5fec4753"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.413746 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a228613
9980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.432726 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.453554 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.474196 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.486003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.486044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.486055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.486073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.486086 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.490077 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.515067 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.529731 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.548051 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.565998 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.571882 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.573103 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.580682 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.592426 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.593234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.593289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.593303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.593320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.593333 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.611905 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.628379 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.641024 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.656823 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.669659 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b
8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.686481 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.697029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.697214 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.697294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.697363 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.697451 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.700417 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:
42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.721379 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.746088 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.770821 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.785831 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.800105 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:54 crc 
kubenswrapper[4797]: I0930 17:42:54.800163 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.800282 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:42:58.800253106 +0000 UTC m=+29.322752344 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.800804 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.800989 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.801067 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:58.80105358 +0000 UTC m=+29.323553008 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.802872 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.802923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.802937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.802959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.802973 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.817347 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.831361 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.844529 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.870567 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.902136 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.902638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.902660 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.902506 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e49
6b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:54Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902383 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902883 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:58.902868146 +0000 UTC m=+29.425367384 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902809 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902963 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902978 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.903037 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:58.90301848 +0000 UTC m=+29.425517718 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.902823 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.903062 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.903070 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:54 crc kubenswrapper[4797]: E0930 17:42:54.903093 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:42:58.903087242 +0000 UTC m=+29.425586480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.904963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.904995 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.905004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.905020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:54 crc kubenswrapper[4797]: I0930 17:42:54.905028 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:54Z","lastTransitionTime":"2025-09-30T17:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.007885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.007933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.007944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.007969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.007979 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.110773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.110826 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.110841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.110859 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.110870 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.213930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.214195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.214260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.214330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.214404 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.237732 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.237733 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:55 crc kubenswrapper[4797]: E0930 17:42:55.237889 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.237766 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:55 crc kubenswrapper[4797]: E0930 17:42:55.238014 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:42:55 crc kubenswrapper[4797]: E0930 17:42:55.238079 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.317181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.317250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.317262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.317283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.317379 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.413700 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.414275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.414294 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.414305 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.416883 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f" exitCode=0 Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.417480 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.421349 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.421393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.421404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.421425 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.421450 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.442881 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.457618 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 
17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.485504 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.498294 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.516582 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.523815 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.523852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.523861 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.523878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.523890 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.538378 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.553983 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.569935 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.584849 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.610779 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.625578 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.628043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.628091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.628105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 
17:42:55.628124 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.628136 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.644574 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.660761 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.676874 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:55Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.731839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.731882 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.731894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.731912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.731924 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.834398 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.834454 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.834466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.834514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.834525 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.937955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.938032 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.938056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.938091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:55 crc kubenswrapper[4797]: I0930 17:42:55.938116 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:55Z","lastTransitionTime":"2025-09-30T17:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.041552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.041614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.041633 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.041660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.041680 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.145028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.145106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.145130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.145164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.145188 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.247006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.247041 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.247051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.247064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.247073 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.349785 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.349825 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.349839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.349857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.349871 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.424215 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.424273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.426505 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f" exitCode=0 Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.426565 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.448890 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.453099 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.453590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.453605 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.453628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.453642 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.467491 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.489179 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 
17:42:56.518320 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.541211 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.558307 4797 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.558352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.558366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.558387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.558401 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.559280 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.576057 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.601152 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.617073 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.633943 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.650133 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.660715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.660758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.660773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.660795 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.660808 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.661492 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.677356 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.693023 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:56Z is after 2025-08-24T17:21:41Z" Sep 30 
17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.764203 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.764244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.764256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.764272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.764284 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.868083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.868483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.868638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.868824 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.868947 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.973893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.973934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.973950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.973969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:56 crc kubenswrapper[4797]: I0930 17:42:56.973982 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:56Z","lastTransitionTime":"2025-09-30T17:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.077106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.077325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.077413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.077541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.077616 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.180162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.180228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.180251 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.180281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.180303 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.237384 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.237424 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.237389 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:57 crc kubenswrapper[4797]: E0930 17:42:57.237672 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:42:57 crc kubenswrapper[4797]: E0930 17:42:57.237783 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:42:57 crc kubenswrapper[4797]: E0930 17:42:57.237965 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.283987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.284062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.284081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.284132 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.284171 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.387610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.387708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.387723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.387748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.387760 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.435965 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c" exitCode=0 Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.436042 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.465830 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.484357 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.490775 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.490805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.490819 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.490837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.490849 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.502155 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.517362 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.539675 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.555345 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.570883 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.588021 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.594655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.594700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.594714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.594738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.594755 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.606271 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.641039 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.657708 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.673377 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.693775 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.699682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc 
kubenswrapper[4797]: I0930 17:42:57.699731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.699745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.699768 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.699788 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.716531 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:57Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.802108 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.802336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.802420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.802557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.802629 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.906288 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.906339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.906354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.906379 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:57 crc kubenswrapper[4797]: I0930 17:42:57.906394 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:57Z","lastTransitionTime":"2025-09-30T17:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.009816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.010224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.010368 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.010598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.010766 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.114292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.114359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.114375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.114400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.114419 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.217699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.217765 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.217783 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.217807 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.217824 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.320226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.320607 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.320791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.320934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.321081 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.424311 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.424361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.424374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.424395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.424409 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.444010 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa" exitCode=0 Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.444107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.451495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.466009 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.488544 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 
17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.515641 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.527648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.527731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.527749 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.527770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.527787 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.538916 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.565116 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.596318 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.618793 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.630113 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.630156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.630166 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.630183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.630198 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.633818 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f38
31a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.648349 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.662882 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.678538 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.696168 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.711553 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.726859 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.735345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.735421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.735456 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.735505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.735520 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.839669 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.839723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.839761 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.839779 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.839791 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.848339 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.848505 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.848661 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.848733 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:43:06.848700259 +0000 UTC m=+37.371199497 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.848800 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:06.848761161 +0000 UTC m=+37.371260389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.942987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.943046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.943056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.943075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.943090 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:58Z","lastTransitionTime":"2025-09-30T17:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.949753 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.949825 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:58 crc kubenswrapper[4797]: I0930 17:42:58.949875 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950004 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950069 
4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950079 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950089 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950087 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950161 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:06.950135663 +0000 UTC m=+37.472634941 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950169 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950185 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950193 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:06.950178596 +0000 UTC m=+37.472677874 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:58 crc kubenswrapper[4797]: E0930 17:42:58.950238 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:06.950212997 +0000 UTC m=+37.472712415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.046871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.046946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.046964 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.046999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.047019 4797 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.150042 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.150091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.150105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.150125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.150138 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.237691 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.237769 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.237851 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:42:59 crc kubenswrapper[4797]: E0930 17:42:59.237978 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:42:59 crc kubenswrapper[4797]: E0930 17:42:59.238482 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:42:59 crc kubenswrapper[4797]: E0930 17:42:59.238584 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.254618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.254663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.254674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.254692 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.254708 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.357911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.358001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.358022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.358052 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.358067 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460499 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460510 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460794 4797 generic.go:334] "Generic (PLEG): container finished" podID="5be80c8f-41bd-41be-a86f-8c69e7655592" containerID="119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc" exitCode=0 Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.460849 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerDied","Data":"119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.491164 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.507820 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.525223 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.543833 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.565180 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.565506 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc 
kubenswrapper[4797]: I0930 17:42:59.565538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.565609 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.565638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.565654 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.593154 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.613880 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5658q"] Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.614408 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.617089 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.617480 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.620181 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.620265 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.620920 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.634897 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.649219 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.657067 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtzv\" (UniqueName: \"kubernetes.io/projected/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-kube-api-access-rxtzv\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.657334 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-host\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.657414 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-serviceca\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.674427 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.674522 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc 
kubenswrapper[4797]: I0930 17:42:59.674549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.674577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.674491 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.674597 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.692650 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.709581 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.733184 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.748791 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 
17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.758725 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtzv\" (UniqueName: \"kubernetes.io/projected/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-kube-api-access-rxtzv\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.758810 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-host\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.758843 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-serviceca\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.759021 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-host\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.760109 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-serviceca\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.766776 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.780861 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.780938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.780967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.781000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.781024 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.782157 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.790151 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtzv\" (UniqueName: \"kubernetes.io/projected/ec2c2c98-8568-4b97-bc8c-13161ad0c7c5-kube-api-access-rxtzv\") pod \"node-ca-5658q\" (UID: \"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\") " pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.802328 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.826404 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.839542 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.862917 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.876497 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.884280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.884345 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.884360 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.884385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.884402 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.888135 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.902747 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.917347 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.939735 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.948743 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5658q" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.956980 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.972209 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b
8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.988389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.988452 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.988466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:42:59 crc 
kubenswrapper[4797]: I0930 17:42:59.988488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.988504 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:42:59Z","lastTransitionTime":"2025-09-30T17:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:42:59 crc kubenswrapper[4797]: I0930 17:42:59.998005 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:42:59Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.012286 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.092312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.092388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.092400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.092421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.092458 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.195796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.195854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.195865 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.195884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.195896 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.256571 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.273330 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.292298 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.300297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.300354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.300369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.300389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.300402 4797 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.305282 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.332496 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.356200 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.393860 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.414702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.414754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.414766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.414786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.414798 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.422792 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:
42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.448639 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.466579 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.470213 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" event={"ID":"5be80c8f-41bd-41be-a86f-8c69e7655592","Type":"ContainerStarted","Data":"7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.473838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" 
event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.474571 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.474626 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.475954 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5658q" event={"ID":"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5","Type":"ContainerStarted","Data":"82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.475980 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5658q" event={"ID":"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5","Type":"ContainerStarted","Data":"2b3dae017c8d95cefca2b95656b15d5a5324cbb25557c59c7d6756a3ba73b02b"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.481240 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.498737 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.510164 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.510640 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.511179 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2
76703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.517296 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.517339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.517358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.517380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.517394 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.526375 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.539779 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.556772 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.571418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.586141 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.615465 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.620035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.620162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.620257 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.620342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.620455 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.636080 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.654896 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.671106 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.691395 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.718123 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.722737 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.722926 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.723011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.723076 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.723137 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.735657 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.759279 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d601
84de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.776366 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.789229 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.803787 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.817772 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.825694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.825798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.825884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.825987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.826069 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.929571 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.929690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.929708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.930112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:00 crc kubenswrapper[4797]: I0930 17:43:00.930168 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:00Z","lastTransitionTime":"2025-09-30T17:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.016489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.016523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.016534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.016549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.016560 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.032364 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.038959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.039030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.039049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.039074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.039092 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.102367 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.109174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.109220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.109233 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.109249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.109259 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.123083 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.127610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.127645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.127654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.127668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.127678 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.142732 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous retry, elided ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.151094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.151186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.151218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.151252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.151276 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.172665 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous retry, elided ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.172841 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.176038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.176070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.176081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.176097 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.176109 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.238116 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.238273 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.238419 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.238614 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.238677 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:01 crc kubenswrapper[4797]: E0930 17:43:01.238953 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.278781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.278853 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.278875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.278906 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.278930 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.382424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.382501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.382515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.382535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.382552 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.480264 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.484584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.484664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.484716 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.484738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.484757 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.589221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.589304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.589322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.589349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.589370 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.691899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.691968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.691981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.691997 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.692009 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.794332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.794403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.794414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.794429 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.794487 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.897936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.897999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.898020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.898047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:01 crc kubenswrapper[4797]: I0930 17:43:01.898068 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:01Z","lastTransitionTime":"2025-09-30T17:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.001945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.002009 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.002025 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.002048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.002063 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.104638 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.104685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.104701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.104724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.104740 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.207470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.207509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.207529 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.207549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.207562 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.313567 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.313641 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.313661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.313691 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.313711 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.417273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.417336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.417357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.417383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.417401 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.487361 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.521373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.521489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.521511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.521539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.521557 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.625250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.625300 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.625313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.625341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.625357 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.728654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.728696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.728707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.728725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.728736 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.831030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.831070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.831080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.831095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.831106 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.934154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.934193 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.934202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.934217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:02 crc kubenswrapper[4797]: I0930 17:43:02.934228 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:02Z","lastTransitionTime":"2025-09-30T17:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.036601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.036673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.036686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.036701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.036713 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.139205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.139263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.139276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.139305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.139320 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.237816 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.237940 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:03 crc kubenswrapper[4797]: E0930 17:43:03.238060 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.237813 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:03 crc kubenswrapper[4797]: E0930 17:43:03.238294 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:03 crc kubenswrapper[4797]: E0930 17:43:03.238151 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.242372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.242490 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.242572 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.242661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.242722 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.345684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.345740 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.345750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.345769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.345785 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.448382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.448423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.448450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.448471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.448485 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.551138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.551176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.551186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.551201 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.551211 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.654176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.654249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.654274 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.654305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.654328 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.758000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.758063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.758086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.758117 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.758319 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.862076 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.862147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.862164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.862197 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.862215 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.965833 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.965915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.965939 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.965966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:03 crc kubenswrapper[4797]: I0930 17:43:03.965987 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:03Z","lastTransitionTime":"2025-09-30T17:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.068742 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.069389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.069430 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.069477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.069503 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.173302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.173352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.173369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.173394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.173416 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.276913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.276972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.276988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.277011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.277026 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.380636 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.380684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.380731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.380752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.380767 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.484149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.484220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.484238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.484262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.484280 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.497404 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/0.log" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.501220 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d" exitCode=1 Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.501268 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.502099 4797 scope.go:117] "RemoveContainer" containerID="436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.530232 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.553238 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.570907 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.587333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.587404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.587414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.587447 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.587459 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.588209 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.607194 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.628304 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3
d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.645695 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.661558 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.680158 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.690475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc 
kubenswrapper[4797]: I0930 17:43:04.690506 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.690517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.690538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.690551 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.705450 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17
:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.718902 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.735183 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.747601 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.761248 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.771413 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.793497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.793573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.793587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.793612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.793625 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.897871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.897954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.897975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.898001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:04 crc kubenswrapper[4797]: I0930 17:43:04.898019 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:04Z","lastTransitionTime":"2025-09-30T17:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.000793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.000835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.000847 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.000867 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.000879 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.103970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.104064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.104092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.104133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.104155 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.207518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.207580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.207599 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.207625 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.207647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.237334 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:05 crc kubenswrapper[4797]: E0930 17:43:05.237571 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.238134 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:05 crc kubenswrapper[4797]: E0930 17:43:05.238231 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.238302 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:05 crc kubenswrapper[4797]: E0930 17:43:05.238375 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.310884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.310937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.310952 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.310972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.310985 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.414658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.414710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.414725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.414745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.414759 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.507522 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/0.log" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.510727 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.510906 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.516540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.516580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.516610 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.516630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.516642 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.533332 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.546749 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.555838 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.562610 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.579291 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.601621 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.616232 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.620624 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.620680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.620696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.620717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.620742 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.631464 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.645725 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.659835 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.672041 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.691543 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 
17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.713451 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.724585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.724628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.724643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.724663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.724678 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.730641 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.738661 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl"] Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.739184 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.741181 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.741749 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.745566 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b
8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.763410 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.783290 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3
d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.795257 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.805654 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.815888 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.828308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc 
kubenswrapper[4797]: I0930 17:43:05.828354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.828363 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.828380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.828391 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.832217 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.832405 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2k97\" (UniqueName: \"kubernetes.io/projected/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-kube-api-access-q2k97\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.832469 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.832529 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.838639 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.853196 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.869127 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.894994 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.907190 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.923150 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.930758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.930826 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.930844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.930873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.930893 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:05Z","lastTransitionTime":"2025-09-30T17:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.933312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.933361 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.933392 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.933456 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2k97\" (UniqueName: \"kubernetes.io/projected/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-kube-api-access-q2k97\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.934395 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.934794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.941530 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.944648 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.953792 4797 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-q2k97\" (UniqueName: \"kubernetes.io/projected/1d03b039-cc5b-4e6d-ad02-c41e8b60004f-kube-api-access-q2k97\") pod \"ovnkube-control-plane-749d76644c-xq2hl\" (UID: \"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.963332 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:05 crc kubenswrapper[4797]: I0930 17:43:05.981408 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.004638 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.020965 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.034340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.034418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.034467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.034496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.034515 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.041324 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.055562 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" Sep 30 17:43:06 crc kubenswrapper[4797]: W0930 17:43:06.072388 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d03b039_cc5b_4e6d_ad02_c41e8b60004f.slice/crio-cf4dc122cafaac0ed39a72132951198af12d2734feb4a335f0b110bd5b94f1e1 WatchSource:0}: Error finding container cf4dc122cafaac0ed39a72132951198af12d2734feb4a335f0b110bd5b94f1e1: Status 404 returned error can't find the container with id cf4dc122cafaac0ed39a72132951198af12d2734feb4a335f0b110bd5b94f1e1 Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.137846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.137894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.137905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.137932 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.137947 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.241348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.241413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.241479 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.241509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.241528 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.345043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.345093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.345108 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.345131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.345147 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.451392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.451492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.451512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.451537 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.451561 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.500540 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rx9f5"] Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.501216 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.501300 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.522337 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/1.log" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.523304 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/0.log" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.525096 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.527296 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2" exitCode=1 Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.527391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.527500 4797 scope.go:117] "RemoveContainer" containerID="436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.528624 4797 scope.go:117] "RemoveContainer" containerID="021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2" Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.528927 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.531541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" event={"ID":"1d03b039-cc5b-4e6d-ad02-c41e8b60004f","Type":"ContainerStarted","Data":"f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.531897 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" event={"ID":"1d03b039-cc5b-4e6d-ad02-c41e8b60004f","Type":"ContainerStarted","Data":"25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.532057 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" event={"ID":"1d03b039-cc5b-4e6d-ad02-c41e8b60004f","Type":"ContainerStarted","Data":"cf4dc122cafaac0ed39a72132951198af12d2734feb4a335f0b110bd5b94f1e1"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.539576 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b
8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.554805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.554879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.554892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.554914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.554932 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.562879 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11
9fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.577452 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.598527 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.612960 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.628662 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.641067 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.641196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrqg\" (UniqueName: \"kubernetes.io/projected/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-kube-api-access-whrqg\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.644206 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.657636 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.657702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.657721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.657748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.657767 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.664143 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.680208 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.705702 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.729464 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.742468 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrqg\" (UniqueName: \"kubernetes.io/projected/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-kube-api-access-whrqg\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.742541 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.742625 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.742693 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.742752 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:07.242735497 +0000 UTC m=+37.765234735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.754296 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.760210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.760262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.760281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.760305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.760324 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.765516 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrqg\" (UniqueName: \"kubernetes.io/projected/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-kube-api-access-whrqg\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.767416 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.781344 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.796880 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.810060 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.822036 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.835802 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.847610 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.863098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.863152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.863170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.863194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.863209 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.864043 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.879056 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.899857 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.915624 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.932346 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.944700 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:43:06 crc 
kubenswrapper[4797]: I0930 17:43:06.944925 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.944992 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:43:22.944945166 +0000 UTC m=+53.467444424 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.945066 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:06 crc kubenswrapper[4797]: E0930 17:43:06.945161 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:22.945137891 +0000 UTC m=+53.467637179 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.951365 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.966461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.966509 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.966520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.966539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.966550 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:06Z","lastTransitionTime":"2025-09-30T17:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.971010 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 
6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:06 crc kubenswrapper[4797]: I0930 17:43:06.987193 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.000136 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.014901 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.027918 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.045752 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.045832 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.045855 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046012 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046031 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046042 4797 projected.go:194] Error preparing data 
for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046089 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046143 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046162 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046108 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:23.046091432 +0000 UTC m=+53.568590670 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046283 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:23.046256997 +0000 UTC m=+53.568756245 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046358 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.046403 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:23.046394101 +0000 UTC m=+53.568893349 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.062605 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.069195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.069249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.069262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.069282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.069302 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.089914 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:07 crc 
kubenswrapper[4797]: I0930 17:43:07.173139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.173176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.173185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.173200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.173211 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.237039 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.237095 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.237198 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.237311 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.237566 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.237598 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.247288 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.247529 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: E0930 17:43:07.247662 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:08.247634301 +0000 UTC m=+38.770133719 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.276389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.276507 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.276525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.276551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.276573 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.380232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.380380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.380410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.380477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.380508 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.484297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.484350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.484362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.484383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.484395 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.537998 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/1.log" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.587166 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.587240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.587263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.587294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.587313 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.690623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.691162 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.691180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.691200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.691245 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.795284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.795382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.795400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.795426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.795508 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.899534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.899589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.899606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.899631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:07 crc kubenswrapper[4797]: I0930 17:43:07.899649 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:07Z","lastTransitionTime":"2025-09-30T17:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.002047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.002094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.002107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.002126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.002139 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.105491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.105755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.105848 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.105920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.105977 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.209597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.209992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.210118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.210224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.210320 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.238259 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:08 crc kubenswrapper[4797]: E0930 17:43:08.238504 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.261373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:08 crc kubenswrapper[4797]: E0930 17:43:08.261830 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:08 crc kubenswrapper[4797]: E0930 17:43:08.262012 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:10.261990327 +0000 UTC m=+40.784489565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.313343 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.313407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.313425 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.313477 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.313494 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.416075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.416127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.416142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.416161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.416174 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.519699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.519748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.519757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.519774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.519787 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.623708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.624562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.624618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.624646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.624665 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.727578 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.727654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.727672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.727707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.727726 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.831885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.831973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.831993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.832019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.832069 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.934590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.934646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.934659 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.934680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:08 crc kubenswrapper[4797]: I0930 17:43:08.934692 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:08Z","lastTransitionTime":"2025-09-30T17:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.037770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.037826 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.037840 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.037860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.037881 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.141230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.141289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.141303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.141319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.141330 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.237834 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.237908 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.237862 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:09 crc kubenswrapper[4797]: E0930 17:43:09.238087 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:09 crc kubenswrapper[4797]: E0930 17:43:09.238259 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:09 crc kubenswrapper[4797]: E0930 17:43:09.238485 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.245748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.245805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.245823 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.245850 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.245869 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.349028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.349082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.349093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.349113 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.349123 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.452099 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.452174 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.452194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.452221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.452239 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.555014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.555080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.555098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.555123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.555139 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.658859 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.658902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.658913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.658934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.658947 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.768806 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.768853 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.768934 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.768953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.768965 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.872561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.872663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.872680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.872708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.872726 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.975223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.975602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.975680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.975757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:09 crc kubenswrapper[4797]: I0930 17:43:09.975894 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:09Z","lastTransitionTime":"2025-09-30T17:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.080124 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.080204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.080240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.080261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.080274 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.183605 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.183679 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.183703 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.183737 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.183762 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.237355 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:10 crc kubenswrapper[4797]: E0930 17:43:10.238620 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.257724 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcb
d8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.280840 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.286314 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:10 crc kubenswrapper[4797]: E0930 17:43:10.286627 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:10 crc kubenswrapper[4797]: E0930 17:43:10.286716 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:14.286689637 +0000 UTC m=+44.809188905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.287112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.287519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.287547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.287582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.287602 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.308902 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.328134 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.358755 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.379103 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc 
kubenswrapper[4797]: I0930 17:43:10.390019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.390065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.390082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.390106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.390124 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.415025 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.435650 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.458592 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.481767 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.493632 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc 
kubenswrapper[4797]: I0930 17:43:10.493701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.493720 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.493748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.493766 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.517572 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://436e8f24742bda97b24fdee38e0992c95cebd53ad041c8122d208c2e204e953d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:03Z\\\",\\\"message\\\":\\\"290 6095 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.441915 6095 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:03.441933 6095 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:03.442006 6095 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:03.442013 6095 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:03.442060 6095 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 17:43:03.442078 6095 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:03.442084 6095 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:03.442114 6095 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:43:03.442116 6095 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:03.442135 6095 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:03.442174 6095 factory.go:656] Stopping watch factory\\\\nI0930 17:43:03.442183 6095 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.537187 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.564616 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.585869 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.597372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.597672 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.597682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.597699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.597711 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.603086 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.622341 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.640921 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.700748 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.700824 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.700848 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.700927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.700946 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.804684 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.804788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.804817 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.804850 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.804877 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.908500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.908601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.908629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.908667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:10 crc kubenswrapper[4797]: I0930 17:43:10.908693 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:10Z","lastTransitionTime":"2025-09-30T17:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.011955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.012026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.012044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.012077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.012094 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.114901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.114965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.114979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.115004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.115037 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.217939 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.217990 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.218000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.218017 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.218029 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.237786 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.237857 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.237786 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.237965 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.238085 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.238226 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.304908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.304996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.305026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.305056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.305077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.326564 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.332483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.332554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.332572 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.332600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.332620 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.351287 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.357392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.357496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.357515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.357546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.357567 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.378859 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.383649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.383712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.383728 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.383774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.383788 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.401273 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.405488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.405550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.405563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.405579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.405590 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.421599 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:11 crc kubenswrapper[4797]: E0930 17:43:11.421781 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.423792 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.423878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.423901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.423933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.423959 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.527352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.527415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.527471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.527500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.527517 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.630406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.630496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.630511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.630534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.630554 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.733347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.733419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.733478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.733503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.733521 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.836321 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.836383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.836401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.836424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.836476 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.940798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.940878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.940899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.940928 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:11 crc kubenswrapper[4797]: I0930 17:43:11.940950 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:11Z","lastTransitionTime":"2025-09-30T17:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.043966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.044028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.044046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.044064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.044077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.147261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.147341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.147357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.147541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.147613 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.237865 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:12 crc kubenswrapper[4797]: E0930 17:43:12.238124 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.251385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.251485 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.251504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.251532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.251551 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.355214 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.355266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.355277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.355295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.355308 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.458529 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.458605 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.458623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.458651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.458672 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.562523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.562578 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.562587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.562604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.562618 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.666519 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.666608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.666628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.667105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.667385 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.771080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.771155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.771179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.771225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.771255 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.875069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.875149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.875172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.875219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.875275 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.980214 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.980303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.980325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.980355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:12 crc kubenswrapper[4797]: I0930 17:43:12.980376 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:12Z","lastTransitionTime":"2025-09-30T17:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.084175 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.084266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.084284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.084308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.084325 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.187853 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.187931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.187968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.187993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.188006 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.237223 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.237289 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:13 crc kubenswrapper[4797]: E0930 17:43:13.237425 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:13 crc kubenswrapper[4797]: E0930 17:43:13.237572 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.237627 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:13 crc kubenswrapper[4797]: E0930 17:43:13.237705 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.291900 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.291965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.291985 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.292011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.292032 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.396404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.396526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.396556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.396590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.396610 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.500551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.500619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.500643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.500676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.500701 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.604922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.604993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.605019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.605054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.605077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.708127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.708180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.708198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.708225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.708242 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.811882 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.811949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.811968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.811992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.812012 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.915361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.915468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.915489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.915518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:13 crc kubenswrapper[4797]: I0930 17:43:13.915541 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:13Z","lastTransitionTime":"2025-09-30T17:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.019089 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.019172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.019209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.019241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.019259 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.122809 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.122871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.122891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.122917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.122934 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.227550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.227615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.227632 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.227657 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.227677 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.238171 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:14 crc kubenswrapper[4797]: E0930 17:43:14.238335 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.331560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.331644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.331670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.331701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.331719 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.339222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:14 crc kubenswrapper[4797]: E0930 17:43:14.339475 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:14 crc kubenswrapper[4797]: E0930 17:43:14.339565 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:22.339540862 +0000 UTC m=+52.862040130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.434873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.434941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.434958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.435042 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.435062 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.538520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.538645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.538666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.538696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.538735 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.641875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.641924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.641937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.641956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.641966 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.744670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.744747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.744771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.744805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.744831 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.848205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.848261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.848273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.848290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.848303 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.952300 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.952348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.952359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.952376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:14 crc kubenswrapper[4797]: I0930 17:43:14.952388 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:14Z","lastTransitionTime":"2025-09-30T17:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.055602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.055648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.055689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.055709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.055723 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.158718 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.159338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.159916 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.160219 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.160472 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.237821 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.237845 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.237884 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:15 crc kubenswrapper[4797]: E0930 17:43:15.238770 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:15 crc kubenswrapper[4797]: E0930 17:43:15.238810 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:15 crc kubenswrapper[4797]: E0930 17:43:15.238849 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.263533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.263559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.263569 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.263582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.263592 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.366774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.366805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.366830 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.366845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.366855 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.470494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.470553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.470573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.470598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.470615 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.573719 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.573775 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.573788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.573812 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.573827 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.677164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.677640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.679248 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.679600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.680463 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.785120 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.785179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.785198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.785223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.785243 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.899283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.899329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.899340 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.899355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:15 crc kubenswrapper[4797]: I0930 17:43:15.899367 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:15Z","lastTransitionTime":"2025-09-30T17:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.002681 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.002741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.002754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.002776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.002789 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.106163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.106213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.106224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.106241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.106256 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.209906 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.210005 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.210557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.210647 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.210919 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.237346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:16 crc kubenswrapper[4797]: E0930 17:43:16.237608 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.313973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.314063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.314088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.314122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.314146 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.417192 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.417256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.417270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.417292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.417306 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.520825 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.520885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.520899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.520917 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.520929 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.623940 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.624044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.624065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.624131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.624153 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.728480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.728548 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.728565 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.728592 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.728612 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.831329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.831411 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.831429 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.831483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.831503 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.934069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.934127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.934144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.934173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:16 crc kubenswrapper[4797]: I0930 17:43:16.934195 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:16Z","lastTransitionTime":"2025-09-30T17:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.037458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.037506 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.037543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.037564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.037578 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.140200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.140244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.140258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.140279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.140292 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.237423 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.237478 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.237472 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:17 crc kubenswrapper[4797]: E0930 17:43:17.238281 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:17 crc kubenswrapper[4797]: E0930 17:43:17.237943 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:17 crc kubenswrapper[4797]: E0930 17:43:17.243702 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.243856 4797 scope.go:117] "RemoveContainer" containerID="021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.245299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.245343 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.245355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.245370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.245382 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.263930 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.279397 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.299031 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.312672 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc 
kubenswrapper[4797]: I0930 17:43:17.330388 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
xtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.348859 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.348892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.348902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.348918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.348929 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.355069 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.370111 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.389866 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.411871 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.439859 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.455866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.455928 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.455943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.455967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.455982 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.459289 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.478787 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.494357 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.507337 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.519866 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.538997 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.558453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.558502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.558515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.558534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.558548 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.562105 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.593795 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/1.log" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.597930 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.598129 4797 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.615361 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.633083 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.648402 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.661755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.661788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.661798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc 
kubenswrapper[4797]: I0930 17:43:17.661829 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.661841 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.663237 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.677612 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.695192 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.717861 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.739100 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc 
kubenswrapper[4797]: I0930 17:43:17.760556 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.764792 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.764841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.764854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.764872 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.764885 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.790471 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.810633 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.838092 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.859537 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.868760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.868857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.868877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.868904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.868922 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.874481 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.896017 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.910264 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.924016 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.972122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.972167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.972184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.972206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:17 crc kubenswrapper[4797]: I0930 17:43:17.972221 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:17Z","lastTransitionTime":"2025-09-30T17:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.075295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.075375 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.075395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.075429 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.075477 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.178851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.178890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.178898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.178915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.178924 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.237254 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:18 crc kubenswrapper[4797]: E0930 17:43:18.237674 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.281833 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.281890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.281905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.281930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.281944 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.385518 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.385565 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.385579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.385600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.385615 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.489154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.489196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.489204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.489223 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.489235 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.592599 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.592664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.592683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.592712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.592729 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.605114 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/2.log" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.606023 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/1.log" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.610249 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" exitCode=1 Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.610340 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.610752 4797 scope.go:117] "RemoveContainer" containerID="021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.615308 4797 scope.go:117] "RemoveContainer" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" Sep 30 17:43:18 crc kubenswrapper[4797]: E0930 17:43:18.615992 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.631581 4797 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.656713 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"d
ata-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.675154 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.691598 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.696104 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.696273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.696357 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.696479 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.696574 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.708817 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6
c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.733860 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector 
*v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 
6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.749212 4797 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.763616 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.778985 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.791593 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.804855 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.805218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.806306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.806391 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.806407 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.808233 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.829161 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T
17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.849781 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.865765 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.882087 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.898597 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.908984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.909014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.909024 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.909039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.909049 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:18Z","lastTransitionTime":"2025-09-30T17:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:18 crc kubenswrapper[4797]: I0930 17:43:18.914392 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc 
kubenswrapper[4797]: I0930 17:43:19.013256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.013309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.013325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.013347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.013362 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.117687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.117787 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.117810 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.117845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.117865 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.220898 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.220974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.220987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.221016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.221030 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.237096 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.237174 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.237110 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:19 crc kubenswrapper[4797]: E0930 17:43:19.237324 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:19 crc kubenswrapper[4797]: E0930 17:43:19.237427 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:19 crc kubenswrapper[4797]: E0930 17:43:19.237531 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.323080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.323119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.323128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.323143 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.323152 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.328347 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.339356 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.357472 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a10
6d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.373336 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.390864 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.405746 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.426198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc 
kubenswrapper[4797]: I0930 17:43:19.426253 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.426263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.426294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.426309 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.428079 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930
 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8b
e6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.443363 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.463314 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.482090 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.502925 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.520300 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.529601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.529661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.529674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.529693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.529709 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.538894 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.557345 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T
17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.574797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.590930 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.608417 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.617675 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/2.log" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.633501 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.634035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.634067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.634092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.634117 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.634134 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.650297 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:19 crc 
kubenswrapper[4797]: I0930 17:43:19.737169 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.737206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.737216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.737230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.737240 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.840727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.840804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.840845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.840921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.840952 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.944615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.944692 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.944709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.944736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:19 crc kubenswrapper[4797]: I0930 17:43:19.944756 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:19Z","lastTransitionTime":"2025-09-30T17:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.049064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.049139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.049156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.049185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.049204 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.152964 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.153011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.153023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.153045 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.153058 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.238408 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:20 crc kubenswrapper[4797]: E0930 17:43:20.238635 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.256734 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.256829 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.256852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.256886 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.256904 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.273899 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.295046 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.316153 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.342807 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.359950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc 
kubenswrapper[4797]: I0930 17:43:20.360015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.360034 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.360058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.360078 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.367146 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021b84f9007d392f6bd98bfdabeaca29a1fc5342bde1d8f2bb9e063d8b1291b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"message\\\":\\\"go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930
 17:43:05.718344 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:05.718395 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:05.718419 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718730 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718922 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:43:05.718967 6272 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:43:05.720971 6272 factory.go:656] Stopping watch factory\\\\nI0930 17:43:05.725038 6272 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:43:05.725167 6272 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:43:05.725306 6272 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:05.725367 6272 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:05.725531 6272 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8b
e6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.385093 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.408580 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.428389 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.443473 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.461665 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.463255 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.463298 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.463310 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.463327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.463340 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.475792 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.489264 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.511268 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.571242 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.575339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.575397 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.575411 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.575694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.575723 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.597195 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.621422 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.636179 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc 
kubenswrapper[4797]: I0930 17:43:20.647470 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:20Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.678860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.678897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.678907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.678926 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.678937 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.781539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.781589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.781598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.781616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.781627 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.885659 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.885744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.885763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.885793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.885813 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.989181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.989226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.989235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.989251 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:20 crc kubenswrapper[4797]: I0930 17:43:20.989262 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:20Z","lastTransitionTime":"2025-09-30T17:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.092892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.092945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.092956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.092975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.092991 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.196208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.196276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.196295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.196323 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.196342 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.237879 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.237992 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.238047 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.237882 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.238191 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.238320 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.299460 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.299501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.299514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.299532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.299544 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.403003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.403075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.403092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.403118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.403130 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.446767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.446839 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.446858 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.446886 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.446910 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.469505 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.474904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.474979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.474999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.475024 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.475043 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.493829 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.499269 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.499357 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.499511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.499561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.499597 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.517604 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.523666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.523706 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.523720 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.523741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.523760 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.544600 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.549746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.549810 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.549830 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.549897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.549918 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.564849 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:21 crc kubenswrapper[4797]: E0930 17:43:21.564961 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.566395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.566453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.566466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.566485 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.566500 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.676403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.676497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.676517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.676543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.676563 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.779921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.779968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.779976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.779993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.780005 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.884406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.884550 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.884578 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.884613 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.884644 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.988512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.988562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.988580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.988604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:21 crc kubenswrapper[4797]: I0930 17:43:21.988621 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:21Z","lastTransitionTime":"2025-09-30T17:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.091739 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.091795 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.091815 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.091843 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.091861 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.195888 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.195937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.195954 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.195980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.195999 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.237504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:22 crc kubenswrapper[4797]: E0930 17:43:22.237754 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.298804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.298895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.298924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.298961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.298981 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.402516 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.402624 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.402651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.402686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.402712 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.434985 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:22 crc kubenswrapper[4797]: E0930 17:43:22.435215 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:22 crc kubenswrapper[4797]: E0930 17:43:22.435370 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:43:38.435335653 +0000 UTC m=+68.957834921 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.506088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.506146 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.506158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.506175 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.506190 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.609008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.609056 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.609067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.609084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.609095 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.712251 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.712318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.712335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.712365 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.712384 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.815992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.816092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.816123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.816158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.816181 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.919816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.919890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.919908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.919936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:22 crc kubenswrapper[4797]: I0930 17:43:22.919954 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:22Z","lastTransitionTime":"2025-09-30T17:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.023765 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.023849 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.023885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.023930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.023955 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.041359 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.041590 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:43:55.041553979 +0000 UTC m=+85.564053257 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.041729 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.041887 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.041964 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:55.04194459 +0000 UTC m=+85.564443858 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.127747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.127817 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.127842 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.127875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.127901 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.143536 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.143617 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.143690 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.143825 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.143874 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.143896 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.143937 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.143998 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:55.143967613 +0000 UTC m=+85.666466881 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.144003 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.144043 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:55.144014994 +0000 UTC m=+85.666514262 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.144053 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.144079 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.144188 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:43:55.144158318 +0000 UTC m=+85.666657596 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.231692 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.231770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.231797 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.231833 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.231857 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.238030 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.238060 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.238094 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.238421 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.238689 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.238866 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.336240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.336640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.336897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.337220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.337537 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.440976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.441348 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.441523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.441656 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.441816 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.545580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.545644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.545661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.545687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.545704 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.625845 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.627658 4797 scope.go:117] "RemoveContainer" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" Sep 30 17:43:23 crc kubenswrapper[4797]: E0930 17:43:23.628003 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.648697 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.648754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.648774 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.648800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.648819 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.651524 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc 
kubenswrapper[4797]: I0930 17:43:23.675667 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.697561 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.715667 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.741544 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.753160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.753211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.753228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.753254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.753272 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.766156 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.791669 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.809416 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.836240 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17
:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.856114 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.856185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.856208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.856234 4797 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.856253 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.857590 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee
88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.879235 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.892221 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfd
ea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.905666 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.919998 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.938597 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.954143 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.959003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.959052 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.959068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.959091 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.959107 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:23Z","lastTransitionTime":"2025-09-30T17:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.973293 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139
980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:23 crc kubenswrapper[4797]: I0930 17:43:23.995931 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:23Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.062860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.063504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.063547 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.063582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.063608 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.166687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.167047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.167055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.167073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.167084 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.237722 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:24 crc kubenswrapper[4797]: E0930 17:43:24.237990 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.269930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.270200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.270336 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.270553 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.270684 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.373801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.373857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.373870 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.373890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.373903 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.477307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.477353 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.477364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.477384 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.477397 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.580641 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.580696 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.580712 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.580735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.580752 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.683970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.684040 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.684058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.684086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.684106 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.787173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.787245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.787262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.787289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.787307 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.891560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.891660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.891678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.891721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.891738 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.995412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.995512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.995534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.995565 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:24 crc kubenswrapper[4797]: I0930 17:43:24.995586 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:24Z","lastTransitionTime":"2025-09-30T17:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.098559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.098625 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.098646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.098671 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.098689 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.201307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.201386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.201408 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.201480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.201509 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.237813 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.237870 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.237838 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:25 crc kubenswrapper[4797]: E0930 17:43:25.238060 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:25 crc kubenswrapper[4797]: E0930 17:43:25.238179 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:25 crc kubenswrapper[4797]: E0930 17:43:25.238369 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.305006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.305418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.305628 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.305833 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.305963 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.409630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.409722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.409747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.409773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.409791 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.513235 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.513322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.513346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.513378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.513403 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.616135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.616181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.616202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.616222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.616238 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.719872 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.719939 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.719956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.719982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.720001 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.822956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.823022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.823039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.823068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.823089 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.925908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.925950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.925962 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.925981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:25 crc kubenswrapper[4797]: I0930 17:43:25.925994 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:25Z","lastTransitionTime":"2025-09-30T17:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.029231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.029295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.029309 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.029329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.029344 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.132523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.132579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.132591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.132615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.132660 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.236077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.236141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.236157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.236183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.236211 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.237149 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:26 crc kubenswrapper[4797]: E0930 17:43:26.237326 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.339212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.339276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.339293 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.339318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.339335 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.443287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.443389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.443410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.443484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.443522 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.546611 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.546675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.546698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.546724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.546748 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.649755 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.649827 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.649845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.649871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.649888 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.752747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.753096 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.753184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.753299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.753386 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.857152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.857208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.857229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.857254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.857271 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.959921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.959996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.960019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.960050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:26 crc kubenswrapper[4797]: I0930 17:43:26.960072 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:26Z","lastTransitionTime":"2025-09-30T17:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.064109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.064180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.064199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.064226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.064243 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.167915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.168000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.168013 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.168032 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.168048 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.238088 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.238214 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:27 crc kubenswrapper[4797]: E0930 17:43:27.238289 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.238093 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:27 crc kubenswrapper[4797]: E0930 17:43:27.238631 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:27 crc kubenswrapper[4797]: E0930 17:43:27.238416 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.272217 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.272282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.272294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.272311 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.272322 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.375359 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.375424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.375469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.375497 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.375517 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.478807 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.478976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.478994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.479021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.479040 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.582976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.583063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.583083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.583112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.583132 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.686721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.686780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.686792 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.686814 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.686826 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.790060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.790134 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.790152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.790178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.790196 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.894207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.894275 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.894299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.894333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.894356 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.998314 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.998416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.998498 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.998526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:27 crc kubenswrapper[4797]: I0930 17:43:27.998548 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:27Z","lastTransitionTime":"2025-09-30T17:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.129999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.130083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.130102 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.130135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.130154 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.233285 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.233355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.233380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.233412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.233469 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.238032 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:28 crc kubenswrapper[4797]: E0930 17:43:28.238162 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.336966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.337030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.337045 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.337071 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.337217 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.440800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.440853 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.440870 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.440895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.440913 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.544886 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.544957 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.544981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.545010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.545031 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.648811 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.649023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.649046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.649075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.649092 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.752496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.752580 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.752600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.752627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.752646 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.855960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.856018 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.856034 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.856057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.856075 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.959466 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.959524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.959544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.959573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:28 crc kubenswrapper[4797]: I0930 17:43:28.959591 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:28Z","lastTransitionTime":"2025-09-30T17:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.062863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.062950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.062974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.063038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.063067 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.166413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.166492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.166507 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.166528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.166543 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.237792 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.237799 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.237799 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:29 crc kubenswrapper[4797]: E0930 17:43:29.237999 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:29 crc kubenswrapper[4797]: E0930 17:43:29.238247 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:29 crc kubenswrapper[4797]: E0930 17:43:29.238371 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.270531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.270581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.270594 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.270612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.270624 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.374007 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.374082 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.374101 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.374128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.374147 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.477668 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.477731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.477746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.477772 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.477792 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.580694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.580723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.580731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.580747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.580757 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.683793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.683834 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.683844 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.683863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.683875 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.787151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.787215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.787239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.787274 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.787296 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.891103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.891184 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.891205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.891232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.891299 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.994281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.994366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.994380 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.994401 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:29 crc kubenswrapper[4797]: I0930 17:43:29.994811 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:29Z","lastTransitionTime":"2025-09-30T17:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.097938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.098000 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.098014 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.098039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.098058 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.200860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.200993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.201013 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.201093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.201149 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.237265 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:30 crc kubenswrapper[4797]: E0930 17:43:30.237540 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.261978 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.283092 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.301621 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.303894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.303964 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.303977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.303995 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.304009 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.321089 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.339878 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.360745 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.379201 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.399417 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.406823 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.406910 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.406935 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.406994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.407019 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.419172 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.433936 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.453152 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.465523 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc 
kubenswrapper[4797]: I0930 17:43:30.490255 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.507262 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.512213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.512264 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.512281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.512304 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.512319 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.524466 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.543190 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.565483 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.579941 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:30Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.615279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.615364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.615394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.615426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.615488 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.719399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.719508 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.719528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.719561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.719582 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.822467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.822541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.822561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.822587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.822607 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.925930 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.926013 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.926037 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.926076 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:30 crc kubenswrapper[4797]: I0930 17:43:30.926103 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:30Z","lastTransitionTime":"2025-09-30T17:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.029640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.029733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.029757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.029794 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.029818 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.132600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.132780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.132822 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.132914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.132991 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237266 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.237461 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237244 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237719 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237756 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237767 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.237771 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.237798 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.238064 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.342369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.342424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.342450 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.342474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.342489 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.446963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.447042 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.447065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.447100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.447128 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.550085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.550149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.550168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.550199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.550219 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.653066 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.653115 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.653127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.653145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.653155 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.756268 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.756331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.756350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.756376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.756406 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.858843 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.858893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.858903 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.858949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.858961 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.909682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.909757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.909780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.909807 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.909827 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.930873 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.936277 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.936338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.936355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.936383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.936402 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.958348 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.964908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.965259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.965455 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.965600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.965699 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:31 crc kubenswrapper[4797]: E0930 17:43:31.980625 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.985420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.985559 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.985631 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.985727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:31 crc kubenswrapper[4797]: I0930 17:43:31.985815 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:31Z","lastTransitionTime":"2025-09-30T17:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: E0930 17:43:32.001453 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.006251 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.006294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.006306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.006324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.006337 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: E0930 17:43:32.028535 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:32 crc kubenswrapper[4797]: E0930 17:43:32.028791 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.031265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.031335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.031350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.031371 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.031405 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.135835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.135892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.135912 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.135943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.135960 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.237355 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:32 crc kubenswrapper[4797]: E0930 17:43:32.237705 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.240405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.240505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.240535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.240627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.240676 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.343141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.343179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.343191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.343208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.343222 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.446163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.446282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.446299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.446322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.446340 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.549840 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.549921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.549946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.549974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.549991 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.653198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.653280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.653295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.653320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.653335 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.756725 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.757189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.757424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.757667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.757929 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.860322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.860640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.860811 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.860979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.861120 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.964364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.964463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.964483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.964520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:32 crc kubenswrapper[4797]: I0930 17:43:32.964558 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:32Z","lastTransitionTime":"2025-09-30T17:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.067551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.067894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.068050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.068195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.068334 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.171856 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.171926 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.171945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.171974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.171994 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.237426 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.237520 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.237470 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:33 crc kubenswrapper[4797]: E0930 17:43:33.237703 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:33 crc kubenswrapper[4797]: E0930 17:43:33.237834 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:33 crc kubenswrapper[4797]: E0930 17:43:33.237921 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.291084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.291158 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.291178 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.291204 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.291223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.394629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.394686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.394698 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.394719 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.394731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.497793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.498185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.498322 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.498492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.498631 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.601589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.601662 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.601680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.601707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.601727 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.704395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.704492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.704512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.704539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.704555 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.807156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.807211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.807228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.807252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.807280 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.910570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.910688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.910707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.910738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:33 crc kubenswrapper[4797]: I0930 17:43:33.910756 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:33Z","lastTransitionTime":"2025-09-30T17:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.014649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.015005 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.015196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.015413 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.015647 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.119062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.119556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.119707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.119879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.120065 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.222852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.223116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.223290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.223416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.223557 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.237613 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:34 crc kubenswrapper[4797]: E0930 17:43:34.237889 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.326925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.326982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.326994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.327013 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.327023 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.429526 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.429573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.429582 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.429601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.429611 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.532505 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.532540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.532549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.532564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.532575 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.635561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.635612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.635622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.635641 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.635651 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.738392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.738453 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.738464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.738482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.738495 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.841969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.842029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.842040 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.842065 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.842079 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.945989 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.946044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.946054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.946072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:34 crc kubenswrapper[4797]: I0930 17:43:34.946082 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:34Z","lastTransitionTime":"2025-09-30T17:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.050078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.050140 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.050157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.050196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.050219 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.153949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.153992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.154003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.154023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.154039 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.237292 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.237364 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.237370 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:35 crc kubenswrapper[4797]: E0930 17:43:35.237577 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:35 crc kubenswrapper[4797]: E0930 17:43:35.237814 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:35 crc kubenswrapper[4797]: E0930 17:43:35.238005 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.257128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.257171 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.257187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.257205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.257216 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.360806 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.360862 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.360873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.360891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.360902 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.464074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.464125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.464136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.464153 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.464164 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.566825 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.566913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.566938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.567015 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.567036 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.670932 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.671002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.671020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.671048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.671067 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.773504 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.773551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.773562 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.773579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.773590 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.876780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.876929 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.876948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.876979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.876997 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.980724 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.980804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.980822 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.980852 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:35 crc kubenswrapper[4797]: I0930 17:43:35.980874 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:35Z","lastTransitionTime":"2025-09-30T17:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.084126 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.084165 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.084173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.084189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.084198 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.187734 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.187791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.187804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.187827 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.187842 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.237818 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:36 crc kubenswrapper[4797]: E0930 17:43:36.238021 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.290891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.290949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.290959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.290977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.290988 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.393523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.393571 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.393581 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.393601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.393612 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.496721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.497187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.497202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.497224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.497237 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.600073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.600123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.600132 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.600149 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.600158 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.703392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.703467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.703480 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.703501 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.703515 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.806256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.806318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.806333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.806356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.806372 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.910387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.910475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.910489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.910511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:36 crc kubenswrapper[4797]: I0930 17:43:36.910525 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:36Z","lastTransitionTime":"2025-09-30T17:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.013981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.014028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.014038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.014059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.014070 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.116176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.116209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.116218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.116232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.116241 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.219683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.219752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.219771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.219798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.219819 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.238085 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.238259 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.238123 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:37 crc kubenswrapper[4797]: E0930 17:43:37.238376 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:37 crc kubenswrapper[4797]: E0930 17:43:37.238529 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:37 crc kubenswrapper[4797]: E0930 17:43:37.238725 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.323489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.323525 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.323534 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.323549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.323559 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.426614 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.426645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.426654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.426671 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.426682 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.530302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.530355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.530366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.530385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.530397 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.633937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.633999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.634008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.634026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.634039 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.737354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.737463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.737478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.737498 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.737510 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.840932 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.840996 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.841010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.841031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.841045 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.944090 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.944128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.944136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.944152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:37 crc kubenswrapper[4797]: I0930 17:43:37.944164 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:37Z","lastTransitionTime":"2025-09-30T17:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.046621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.046657 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.046665 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.046680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.046689 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.149474 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.149517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.149528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.149544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.149556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.237623 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:38 crc kubenswrapper[4797]: E0930 17:43:38.237795 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.238513 4797 scope.go:117] "RemoveContainer" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" Sep 30 17:43:38 crc kubenswrapper[4797]: E0930 17:43:38.238838 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.251695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.251751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.251762 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.251786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.251798 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.354877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.354922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.354931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.354948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.354960 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.457524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.457570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.457579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.457598 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.457611 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.525289 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:38 crc kubenswrapper[4797]: E0930 17:43:38.525563 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:38 crc kubenswrapper[4797]: E0930 17:43:38.525766 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:44:10.525730177 +0000 UTC m=+101.048229445 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.560597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.560648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.560665 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.560687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.560705 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.664366 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.664446 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.664458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.664476 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.664487 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.767470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.767511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.767524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.767546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.767558 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.869766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.869795 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.869805 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.869820 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.869829 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.972521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.972896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.973057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.973227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:38 crc kubenswrapper[4797]: I0930 17:43:38.973365 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:38Z","lastTransitionTime":"2025-09-30T17:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.076314 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.076374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.076383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.076399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.076410 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.180172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.180221 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.180238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.180262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.180279 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.237693 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.237710 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.237814 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:39 crc kubenswrapper[4797]: E0930 17:43:39.237839 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:39 crc kubenswrapper[4797]: E0930 17:43:39.238057 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:39 crc kubenswrapper[4797]: E0930 17:43:39.238395 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.283002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.283059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.283074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.283097 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.283111 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.386063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.386143 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.386168 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.386203 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.386223 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.488913 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.488958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.488971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.488993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.489008 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.592083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.592131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.592146 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.592165 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.592178 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.695043 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.695118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.695172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.695228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.695279 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.798782 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.798891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.798923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.798955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.798977 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.901601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.901651 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.901661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.901678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:39 crc kubenswrapper[4797]: I0930 17:43:39.901688 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:39Z","lastTransitionTime":"2025-09-30T17:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.003889 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.003960 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.003979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.004006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.004025 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.107381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.107520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.107548 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.107585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.107614 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.210279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.210355 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.210374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.210394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.210405 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.238027 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:40 crc kubenswrapper[4797]: E0930 17:43:40.238208 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.257727 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.275317 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.289396 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.303578 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.312927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.312990 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.313003 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.313020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.313033 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.318748 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.333630 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T
17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.347959 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.362080 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.378477 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.396346 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.413944 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.415570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.415617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.415629 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.415649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.415663 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.427216 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc 
kubenswrapper[4797]: I0930 17:43:40.451542 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.473321 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.491908 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.516284 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.518383 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc 
kubenswrapper[4797]: I0930 17:43:40.518502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.518548 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.518576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.518591 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.551543 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.570258 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.621729 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.621778 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.621788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.621808 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.621823 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.702184 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/0.log" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.702231 4797 generic.go:334] "Generic (PLEG): container finished" podID="aba20a5a-9a27-4df1-899d-a107aef7a231" containerID="df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea" exitCode=1 Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.702266 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerDied","Data":"df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.702701 4797 scope.go:117] "RemoveContainer" containerID="df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.716245 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.724292 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.724459 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.724543 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.724658 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.724745 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.744290 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.760685 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.777658 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.795548 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.819327 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.831648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.831683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.831692 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.831710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.831720 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.834547 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.851086 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.866252 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.880254 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.897022 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.915202 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.933778 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.934083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.934103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.934111 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.934125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.934135 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:40Z","lastTransitionTime":"2025-09-30T17:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.954053 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac4
33e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.971642 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:40 crc kubenswrapper[4797]: I0930 17:43:40.987992 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:40Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.006923 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.019545 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc 
kubenswrapper[4797]: I0930 17:43:41.036306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.036337 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.036346 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.036362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.036370 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.138545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.138594 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.138606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.138623 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.138643 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.237344 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:41 crc kubenswrapper[4797]: E0930 17:43:41.237518 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.237772 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.237824 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:41 crc kubenswrapper[4797]: E0930 17:43:41.237916 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:41 crc kubenswrapper[4797]: E0930 17:43:41.238095 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.240926 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.240984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.241002 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.241017 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.241028 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.342992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.343036 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.343047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.343067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.343079 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.445769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.445810 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.445821 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.445841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.445852 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.549618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.549680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.549695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.549717 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.549731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.652771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.652851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.652873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.652904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.652927 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.707501 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/0.log" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.707570 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerStarted","Data":"80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.721230 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.736577 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.747078 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.755374 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.755415 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.755426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.755459 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.755472 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.760425 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac4
33e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.775819 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.790357 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.805597 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.820386 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.842349 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.855863 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.858670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.858709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.858721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.858745 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.858758 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.887118 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.902687 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.915708 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.929219 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.943359 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.959516 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.960905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.960953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.960963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:41 crc 
kubenswrapper[4797]: I0930 17:43:41.960982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.960994 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:41Z","lastTransitionTime":"2025-09-30T17:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.975291 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17
:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] 
\\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:41 crc kubenswrapper[4797]: I0930 17:43:41.987138 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:41Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.063400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.063473 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.063487 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.063507 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.063520 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.166182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.166243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.166256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.166275 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.166290 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.238284 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.238606 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.268893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.268935 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.268947 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.268965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.268978 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.333136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.333212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.333230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.333265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.333292 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.355887 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.361038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.361112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.361130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.361157 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.361176 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.379910 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.384663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.384720 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.384738 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.384764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.384781 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.404202 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.409114 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.409151 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.409160 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.409176 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.409187 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.427581 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.432424 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.432512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.432662 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.432702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.432721 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.452650 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:42 crc kubenswrapper[4797]: E0930 17:43:42.452770 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.455093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.455131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.455141 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.455159 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.455171 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.557863 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.557910 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.557920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.557937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.557948 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.660714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.660761 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.660772 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.660789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.660801 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.763456 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.763512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.763524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.763560 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.763570 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.866335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.866389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.866402 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.866447 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.866464 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.968840 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.968931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.968945 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.968966 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:42 crc kubenswrapper[4797]: I0930 17:43:42.968977 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:42Z","lastTransitionTime":"2025-09-30T17:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.072622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.072679 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.072693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.072714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.072727 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.175683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.175875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.175975 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.176083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.176182 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.237600 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.237677 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.237795 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:43 crc kubenswrapper[4797]: E0930 17:43:43.237935 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:43 crc kubenswrapper[4797]: E0930 17:43:43.237955 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:43 crc kubenswrapper[4797]: E0930 17:43:43.238095 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.278457 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.278584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.278673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.278770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.279050 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.381982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.382022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.382033 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.382048 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.382058 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.485318 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.485373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.485391 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.485420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.485466 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.595210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.596786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.596970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.597479 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.597741 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.700412 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.700488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.700503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.700528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.700542 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.802421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.802492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.802503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.802521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.802531 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.905618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.905688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.905708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.905736 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:43 crc kubenswrapper[4797]: I0930 17:43:43.905754 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:43Z","lastTransitionTime":"2025-09-30T17:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.008626 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.008670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.008682 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.008700 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.008713 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.112539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.112612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.112630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.112655 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.112674 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.215524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.215597 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.215619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.215650 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.215675 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.237611 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:44 crc kubenswrapper[4797]: E0930 17:43:44.237810 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.319601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.319676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.319707 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.319741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.319763 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.423971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.424054 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.424078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.424109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.424134 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.528068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.528152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.528182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.528220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.528247 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.631430 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.631589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.631620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.631648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.631666 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.735342 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.735428 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.735491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.735532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.735556 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.838407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.838496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.838517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.838556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.838594 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.941965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.942028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.942046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.942073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:44 crc kubenswrapper[4797]: I0930 17:43:44.942090 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:44Z","lastTransitionTime":"2025-09-30T17:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.045180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.045258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.045279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.045308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.045327 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.148994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.149125 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.149143 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.149170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.149189 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.237997 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.238051 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.238050 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:45 crc kubenswrapper[4797]: E0930 17:43:45.238341 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:45 crc kubenswrapper[4797]: E0930 17:43:45.238397 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:45 crc kubenswrapper[4797]: E0930 17:43:45.238604 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.252841 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.252910 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.252937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.252969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.252992 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.363905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.363992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.365230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.365281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.365303 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.469237 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.469295 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.469319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.469352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.469375 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.572533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.572606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.572630 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.572666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.572685 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.676302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.676388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.676410 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.676470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.676491 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.779294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.779373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.779391 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.779418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.779464 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.882213 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.882299 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.882313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.882333 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:45 crc kubenswrapper[4797]: I0930 17:43:45.882348 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:45Z","lastTransitionTime":"2025-09-30T17:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.286498 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.286498 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:46 crc kubenswrapper[4797]: E0930 17:43:46.286643 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:46 crc kubenswrapper[4797]: E0930 17:43:46.286687 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.287646 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.287678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.287687 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.287704 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.287713 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.391116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.391183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.391200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.391232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.391285 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.494170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.494238 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.494259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.494291 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.494310 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.598120 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.598182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.598198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.598225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.598243 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.701590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.701657 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.701675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.701701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.701720 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.805733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.805944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.806070 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.806159 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.806234 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.910019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.910068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.910085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.910109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:46 crc kubenswrapper[4797]: I0930 17:43:46.910125 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:46Z","lastTransitionTime":"2025-09-30T17:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.012663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.012701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.012713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.012732 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.012744 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.115909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.116385 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.116670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.116879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.117027 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.220681 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.220752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.220772 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.220804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.220824 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.237104 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.237346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:47 crc kubenswrapper[4797]: E0930 17:43:47.237349 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:47 crc kubenswrapper[4797]: E0930 17:43:47.237902 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.324871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.325650 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.325693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.325734 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.325752 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.428998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.429083 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.429103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.429130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.429150 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.532674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.532760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.532784 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.532816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.532843 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.636354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.636421 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.636472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.636498 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.636516 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.739384 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.739468 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.739487 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.739510 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.739529 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.842541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.842600 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.842618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.842641 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.842658 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.946195 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.946271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.946290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.946316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:47 crc kubenswrapper[4797]: I0930 17:43:47.946335 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:47Z","lastTransitionTime":"2025-09-30T17:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.049713 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.049787 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.049804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.049835 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.049854 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.153362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.153416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.153470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.153503 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.153528 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.237750 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.238654 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:48 crc kubenswrapper[4797]: E0930 17:43:48.239155 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:48 crc kubenswrapper[4797]: E0930 17:43:48.239617 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.257854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.257905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.257922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.257948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.257968 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.258666 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.361842 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.361894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.361911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.361933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.361950 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.465119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.465187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.465211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.465245 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.465267 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.568817 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.568892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.568914 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.568946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.568968 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.672187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.672252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.672278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.672312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.672334 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.775408 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.775507 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.775563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.775595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.775617 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.879481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.879588 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.879608 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.879637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.879659 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.982908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.982973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.982992 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.983026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:48 crc kubenswrapper[4797]: I0930 17:43:48.983046 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:48Z","lastTransitionTime":"2025-09-30T17:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.086316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.086364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.086382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.086406 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.086425 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.189822 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.189896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.189919 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.189953 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.189975 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.237979 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.238036 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:49 crc kubenswrapper[4797]: E0930 17:43:49.238159 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:49 crc kubenswrapper[4797]: E0930 17:43:49.238279 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.292925 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.292995 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.293020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.293050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.293074 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.395897 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.395949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.395970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.395997 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.396016 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.498889 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.498950 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.498967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.498994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.499013 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.601152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.601216 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.601229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.601252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.601267 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.705069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.705139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.705161 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.705189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.705209 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.808883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.808941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.808961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.808987 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.809005 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.913860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.913909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.913922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.913941 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:49 crc kubenswrapper[4797]: I0930 17:43:49.913954 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:49Z","lastTransitionTime":"2025-09-30T17:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.017024 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.017121 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.017147 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.017181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.017204 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.121196 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.121264 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.121280 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.121307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.121327 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.224771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.224843 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.224861 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.224893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.224912 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.237489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.237991 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:50 crc kubenswrapper[4797]: E0930 17:43:50.237969 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:50 crc kubenswrapper[4797]: E0930 17:43:50.238687 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.239146 4797 scope.go:117] "RemoveContainer" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.261987 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87e
b7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP
\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.284526 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.306458 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.328723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.328806 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.328822 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc 
kubenswrapper[4797]: I0930 17:43:50.328845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.328888 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.335734 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.354835 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.379456 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.399139 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 
17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.412476 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc 
kubenswrapper[4797]: I0930 17:43:50.425677 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.431283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.431400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.431520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.431620 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.431704 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.440862 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.456692 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.473076 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.488974 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.517640 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.529560 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.533627 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.533659 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.533670 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.533689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.533701 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.542947 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.571298 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.590641 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.612312 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.636703 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.636744 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.636754 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.636773 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.636785 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.739918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.739995 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.740022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.740055 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.740079 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.747743 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/2.log" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.751949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.752616 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.787527 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.813825 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-api
server\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f
36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139
980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.841658 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.843138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.843258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.843283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.843319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.843343 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.862867 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.893542 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.902247 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc 
kubenswrapper[4797]: I0930 17:43:50.911480 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.929058 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.944244 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.945810 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.945874 
4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.945896 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.945928 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.945950 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:50Z","lastTransitionTime":"2025-09-30T17:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.966128 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:50 crc kubenswrapper[4797]: I0930 17:43:50.981047 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.006970 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.017942 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.031753 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc 
kubenswrapper[4797]: I0930 17:43:51.044989 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.049373 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.049583 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.049625 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.049757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.049821 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.058592 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.071521 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.083291 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfd
ea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.095257 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.152746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.152783 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.152793 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.152810 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.152819 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.237730 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.237816 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:51 crc kubenswrapper[4797]: E0930 17:43:51.238002 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:51 crc kubenswrapper[4797]: E0930 17:43:51.238204 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.256273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.256324 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.256341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.256369 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.256387 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.359847 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.359881 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.359892 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.359909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.359922 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.462647 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.462722 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.462746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.462779 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.462803 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.566589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.566649 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.566667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.566693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.566712 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.670218 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.670305 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.670327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.670362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.670385 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.762354 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/3.log" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.763370 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/2.log" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.767783 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" exitCode=1 Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.767848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.767917 4797 scope.go:117] "RemoveContainer" containerID="f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.768873 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:43:51 crc kubenswrapper[4797]: E0930 17:43:51.769142 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.773192 4797 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.773282 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.773307 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.773335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.773352 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.785657 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.802517 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.825730 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709
fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.849418 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.869772 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.875933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.875988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.876006 4797 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.876035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.876054 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.891577 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf
174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.925193 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ca685eed7ab5fc5e796fa14725f37c59aa5a5b2a8ea93515c60e44f38d82c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:18Z\\\",\\\"message\\\":\\\"ng *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 17:43:18.258097 6461 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 17:43:18.258126 6461 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 17:43:18.258204 6461 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 17:43:18.258220 6461 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 17:43:18.258248 6461 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0930 17:43:18.258259 6461 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 17:43:18.258257 6461 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 17:43:18.258313 6461 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 17:43:18.258262 6461 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 17:43:18.258373 6461 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 17:43:18.258464 6461 factory.go:656] Stopping watch factory\\\\nI0930 17:43:18.258487 6461 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:43:18.258483 6461 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 17:43:18.258539 6461 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:43:18.258548 6461 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:43:18.258680 6461 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:51Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 17:43:51.357056 6853 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:43:51.354925 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: 
I0930 17:43:51.948928 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.969266 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.981197 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.981616 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.981781 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:51 crc 
kubenswrapper[4797]: I0930 17:43:51.981955 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.982184 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:51Z","lastTransitionTime":"2025-09-30T17:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:51 crc kubenswrapper[4797]: I0930 17:43:51.989130 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.007855 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.028020 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.052823 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.081706 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.087329 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.087367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.087382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.087407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.087425 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.107542 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.129150 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.148337 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.172087 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.188088 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc 
kubenswrapper[4797]: I0930 17:43:52.189617 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.189675 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.189695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.189721 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.189740 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.237905 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.237918 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.238082 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.238084 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.293019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.293087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.293108 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.293138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.293162 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.396890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.396949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.396965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.396993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.397011 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.500281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.500354 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.500372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.500400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.500419 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.603316 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.603376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.603394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.603422 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.603463 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.706837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.706905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.706922 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.706952 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.706969 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.759189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.759249 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.759269 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.759297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.759319 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.789278 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/3.log" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.794800 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.795068 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.796709 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.804469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.804535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.804552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.804577 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.804597 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.830850 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.831058 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.837587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.837664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.837680 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.837704 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.837723 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.847396 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.856236 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.860303 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.860330 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.860338 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.860356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.860365 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.868081 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.878363 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.881407 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.882702 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.882735 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.882746 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.882766 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.882779 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.895465 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac4
33e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.901559 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: E0930 17:43:52.901908 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.903472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.903604 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.903690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.903784 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.903860 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:52Z","lastTransitionTime":"2025-09-30T17:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.915227 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.927995 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.941505 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.954307 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.970301 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:51Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} 
name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 17:43:51.357056 6853 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:43:51.354925 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.982102 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:52 crc kubenswrapper[4797]: I0930 17:43:52.992423 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:52Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc 
kubenswrapper[4797]: I0930 17:43:53.006624 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.006673 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.006686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.006706 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.006718 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.009687 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.023676 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.034742 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.047935 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.062155 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.076383 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.096034 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:43:53Z is after 2025-08-24T17:21:41Z" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.109202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.109260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.109272 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.109294 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.109308 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241051 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241212 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: E0930 17:43:53.241259 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241265 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241305 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.241166 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:53 crc kubenswrapper[4797]: E0930 17:43:53.242149 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.344689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.344760 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.344779 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.344807 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.344828 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.448284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.448349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.448393 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.448419 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.448477 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.551991 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.552059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.552077 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.552103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.552122 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.654389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.654502 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.654523 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.654551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.654571 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.758210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.758312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.758331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.758358 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.758386 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.861475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.861537 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.861556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.861584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.861605 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.965061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.965118 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.965131 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.965152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:53 crc kubenswrapper[4797]: I0930 17:43:53.965170 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:53Z","lastTransitionTime":"2025-09-30T17:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.068106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.068182 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.068202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.068231 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.068255 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.171983 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.172087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.172107 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.172133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.172151 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.237227 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.237312 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:54 crc kubenswrapper[4797]: E0930 17:43:54.237501 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:54 crc kubenswrapper[4797]: E0930 17:43:54.237651 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.275685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.275733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.275747 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.275769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.275786 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.379998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.380074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.380100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.380133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.380157 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.482926 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.482986 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.483001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.483023 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.483036 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.586837 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.586895 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.586905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.586923 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.586936 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.690327 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.690403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.690420 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.690472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.690491 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.795200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.795271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.795289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.795315 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.795333 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.898276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.898352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.898367 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.898394 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:54 crc kubenswrapper[4797]: I0930 17:43:54.898410 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:54Z","lastTransitionTime":"2025-09-30T17:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.001301 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.001356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.001372 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.001399 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.001417 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.101269 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.101551 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:44:59.101507066 +0000 UTC m=+149.624006374 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.101624 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.101864 4797 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.101971 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.101953997 +0000 UTC m=+149.624453245 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.104201 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.104252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.104270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.104297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.104312 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.202904 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.202968 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.203027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203200 4797 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203294 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.203269332 +0000 UTC m=+149.725768610 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203286 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203359 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203376 4797 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203479 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203594 4797 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203624 4797 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203506 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.203478568 +0000 UTC m=+149.725977806 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.203752 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.203714264 +0000 UTC m=+149.726213662 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.206814 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.206873 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.206891 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.206931 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.206950 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.237406 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.237509 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.237710 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:55 crc kubenswrapper[4797]: E0930 17:43:55.237889 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.310012 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.310081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.310093 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.310112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.310125 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.412621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.412664 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.412676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.412690 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.412700 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.515924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.515981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.515999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.516022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.516038 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.619081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.619152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.619170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.619198 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.619222 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.722395 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.722498 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.722520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.722551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.722571 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.825645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.825714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.825808 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.825921 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.825963 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.928758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.928821 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.928845 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.928878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:55 crc kubenswrapper[4797]: I0930 17:43:55.928902 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:55Z","lastTransitionTime":"2025-09-30T17:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.032737 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.032796 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.032809 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.032831 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.032845 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.135983 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.136062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.136085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.136114 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.136170 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.237354 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.237549 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:56 crc kubenswrapper[4797]: E0930 17:43:56.237654 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:56 crc kubenswrapper[4797]: E0930 17:43:56.237801 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.239361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.239418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.239483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.239517 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.239541 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.342817 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.342902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.342915 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.342937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.342955 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.446250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.446341 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.446362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.446392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.446413 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.550475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.550539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.550556 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.550589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.550654 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.654885 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.654958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.654981 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.655020 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.655044 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.758984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.759074 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.759094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.759127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.759147 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.862928 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.863021 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.863047 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.863086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.863110 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.966612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.966677 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.966727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.966752 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:56 crc kubenswrapper[4797]: I0930 17:43:56.966770 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:56Z","lastTransitionTime":"2025-09-30T17:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.069703 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.069780 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.069798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.069833 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.069853 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.174039 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.174142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.174169 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.174205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.174229 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.237054 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.237084 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:57 crc kubenswrapper[4797]: E0930 17:43:57.237241 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:57 crc kubenswrapper[4797]: E0930 17:43:57.237487 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.278185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.278260 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.278279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.278308 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.278329 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.381799 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.381861 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.381879 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.381904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.381922 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.484829 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.484893 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.484911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.484943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.484967 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.588733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.588798 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.588815 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.588843 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.588862 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.692425 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.692524 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.692541 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.692568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.692586 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.795882 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.795977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.795999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.796031 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.796054 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.899654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.899714 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.899733 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.899758 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:57 crc kubenswrapper[4797]: I0930 17:43:57.899777 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:57Z","lastTransitionTime":"2025-09-30T17:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.002875 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.002943 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.002961 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.002986 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.003004 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.106587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.106645 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.106663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.106686 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.106702 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.209258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.209313 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.209335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.209364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.209385 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.238065 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.238196 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:43:58 crc kubenswrapper[4797]: E0930 17:43:58.238338 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:43:58 crc kubenswrapper[4797]: E0930 17:43:58.238740 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.313850 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.313924 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.313949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.313979 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.314003 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.417791 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.417854 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.417886 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.417927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.417952 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.521276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.521350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.521377 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.521417 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.521481 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.624999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.625067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.625094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.625122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.625140 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.728871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.728946 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.728967 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.728998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.729019 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.831612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.831671 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.831695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.831731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.831753 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.934701 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.934757 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.934776 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.934801 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:58 crc kubenswrapper[4797]: I0930 17:43:58.934819 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:58Z","lastTransitionTime":"2025-09-30T17:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.037570 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.037644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.037663 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.037695 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.037741 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.140677 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.140749 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.140771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.140800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.140818 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.237893 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.237940 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:43:59 crc kubenswrapper[4797]: E0930 17:43:59.238090 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:43:59 crc kubenswrapper[4797]: E0930 17:43:59.238238 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.243001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.243062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.243085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.243112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.243133 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.345545 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.345654 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.345678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.345708 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.345731 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.449154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.449225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.449250 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.449290 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.449314 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.553965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.554022 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.554038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.554063 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.554081 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.657035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.657105 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.657123 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.657155 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.657179 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.763806 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.764211 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.764228 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.764254 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.764274 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.867038 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.867096 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.867112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.867137 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.867157 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.969993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.970072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.970095 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.970127 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:43:59 crc kubenswrapper[4797]: I0930 17:43:59.970150 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:43:59Z","lastTransitionTime":"2025-09-30T17:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.073851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.073918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.073936 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.073965 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.073984 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.177423 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.177544 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.177561 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.177587 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.177606 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.237223 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5"
Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.237358 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:00 crc kubenswrapper[4797]: E0930 17:44:00.237508 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:00 crc kubenswrapper[4797]: E0930 17:44:00.237570 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.264611 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 
17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.281389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.281809 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.282064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.282281 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.282544 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.287341 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.306505 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.330002 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.349650 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.376705 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369c
ba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.386259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.386532 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.386705 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.386877 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.387045 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.397856 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc 
kubenswrapper[4797]: I0930 17:44:00.415660 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.459836 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.479939 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:
31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.490194 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.490306 
4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.490332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.490405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.490428 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.501916 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.524654 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.556268 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:51Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} 
name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 17:43:51.357056 6853 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:43:51.354925 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.571430 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.592156 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.595109 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.595181 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.595202 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.595232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.595250 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.615797 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.637607 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.656725 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.675782 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4
e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.697878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.697998 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.698058 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.698121 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.698184 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.801750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.801819 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.801838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.801866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.801884 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.905154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.905224 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.905248 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.905286 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:00 crc kubenswrapper[4797]: I0930 17:44:00.905312 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:00Z","lastTransitionTime":"2025-09-30T17:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.009044 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.009110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.009135 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.009173 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.009199 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.112482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.112554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.112576 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.112607 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.112628 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.216388 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.216511 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.216540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.216595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.216617 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.237083 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.237271 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:01 crc kubenswrapper[4797]: E0930 17:44:01.237530 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:01 crc kubenswrapper[4797]: E0930 17:44:01.237750 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.319857 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.319948 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.319974 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.320008 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.320031 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.423819 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.423890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.423907 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.423933 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.423951 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.526806 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.526881 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.526899 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.526927 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.526998 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.630276 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.630356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.630376 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.630402 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.630421 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.733589 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.733667 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.733694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.733728 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.733753 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.836693 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.836769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.836788 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.836816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.836835 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.939870 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.939929 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.939949 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.939977 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:01 crc kubenswrapper[4797]: I0930 17:44:01.939995 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:01Z","lastTransitionTime":"2025-09-30T17:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.043287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.043389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.043418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.043486 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.043515 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.147075 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.147136 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.147154 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.147185 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.147204 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.237565 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.237579 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:02 crc kubenswrapper[4797]: E0930 17:44:02.237822 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:02 crc kubenswrapper[4797]: E0930 17:44:02.237905 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.250820 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.250887 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.250904 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.250938 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.250961 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.353726 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.353803 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.353828 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.353860 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.353882 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.456959 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.456999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.457010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.457028 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.457040 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.559804 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.559874 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.559894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.559920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.559938 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.663909 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.663968 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.664027 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.664061 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.664082 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.767191 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.767328 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.767402 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.767465 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.767490 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.870481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.870549 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.870568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.870593 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.870611 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.928808 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.928862 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.928878 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.928903 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.928920 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: E0930 17:44:02.950980 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.958094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.958186 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.958205 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.958262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.958281 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:02 crc kubenswrapper[4797]: E0930 17:44:02.980524 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.987258 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.987334 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.987352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.987378 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:02 crc kubenswrapper[4797]: I0930 17:44:02.987395 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:02Z","lastTransitionTime":"2025-09-30T17:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: E0930 17:44:03.007597 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.014115 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.014546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.014599 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.014871 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.015429 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.042019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.042088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.042110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.042142 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.042160 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: E0930 17:44:03.063871 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:03 crc kubenswrapper[4797]: E0930 17:44:03.064207 4797 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.066403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.066492 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.066514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.066573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.066591 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.169976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.170052 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.170069 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.170098 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.170118 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.237691 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.237749 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:03 crc kubenswrapper[4797]: E0930 17:44:03.237860 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:03 crc kubenswrapper[4797]: E0930 17:44:03.237982 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.273542 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.273584 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.273601 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.273624 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.273642 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.376036 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.376087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.376100 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.376122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.376146 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.479689 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.479771 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.479789 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.479818 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.479837 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.582894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.582958 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.582976 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.583004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.583023 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.685940 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.686006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.686029 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.686057 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.686077 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.789256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.789331 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.789352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.789386 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.789405 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.893226 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.893284 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.893302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.893335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.893353 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.996279 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.996335 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.996352 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.996377 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:03 crc kubenswrapper[4797]: I0930 17:44:03.996398 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:03Z","lastTransitionTime":"2025-09-30T17:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.099997 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.100051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.100068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.100092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.100110 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.202531 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.202596 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.202613 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.202640 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.202662 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.237470 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:04 crc kubenswrapper[4797]: E0930 17:44:04.237678 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.237772 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:04 crc kubenswrapper[4797]: E0930 17:44:04.238596 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.239107 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:44:04 crc kubenswrapper[4797]: E0930 17:44:04.239427 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.306201 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.306267 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.306285 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.306312 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.306333 4797 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.409636 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.409706 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.409723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.409750 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.409769 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.513619 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.513676 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.513694 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.513723 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.513744 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.616993 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.617050 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.617060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.617078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.617111 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.720271 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.720344 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.720361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.720389 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.720406 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.823180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.823240 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.823256 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.823278 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.823294 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.926688 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.926751 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.926769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.926800 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:04 crc kubenswrapper[4797]: I0930 17:44:04.926819 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:04Z","lastTransitionTime":"2025-09-30T17:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.030026 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.030080 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.030092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.030114 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.030128 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.133488 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.133551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.133568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.133594 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.133613 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237052 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237164 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237051 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:05 crc kubenswrapper[4797]: E0930 17:44:05.237273 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237208 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237461 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237483 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.237492 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: E0930 17:44:05.237555 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.341289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.341361 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.341381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.341414 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.341467 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.444911 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.444963 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.444973 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.444994 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.445004 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.548169 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.548207 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.548215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.548232 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.548242 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.651622 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.651674 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.651685 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.651710 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.651725 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.755368 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.755475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.755494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.755522 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.755540 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.858851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.858920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.858937 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.858969 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.858987 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.962230 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.962302 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.962320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.962347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:05 crc kubenswrapper[4797]: I0930 17:44:05.962366 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:05Z","lastTransitionTime":"2025-09-30T17:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.065838 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.065901 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.065919 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.065947 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.065965 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.169731 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.169799 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.169816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.169842 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.169864 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.237881 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.238072 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:06 crc kubenswrapper[4797]: E0930 17:44:06.238337 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:06 crc kubenswrapper[4797]: E0930 17:44:06.238705 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.273481 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.273566 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.273585 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.273615 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.273636 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.377209 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.377400 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.377510 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.377643 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.377728 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.481262 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.481362 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.481382 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.481405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.481422 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.584818 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.584868 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.584884 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.584905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.584921 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.687962 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.688051 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.688068 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.688096 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.688116 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.791484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.791554 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.791572 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.791606 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.791624 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.894920 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.895004 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.895033 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.895072 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:06 crc kubenswrapper[4797]: I0930 17:44:06.895101 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:06Z","lastTransitionTime":"2025-09-30T17:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.044984 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.045087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.045106 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.045139 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.045157 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.148521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.148591 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.148611 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.148637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.148656 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.237478 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:07 crc kubenswrapper[4797]: E0930 17:44:07.237743 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.237891 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:07 crc kubenswrapper[4797]: E0930 17:44:07.238185 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.252092 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.252167 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.252203 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.252237 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.252260 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.354699 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.354763 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.354785 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.354807 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.354823 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.456890 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.456956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.456980 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.457010 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.457032 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.566908 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.566988 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.567012 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.567046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.567073 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.670982 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.671064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.671088 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.671119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.671141 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.774365 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.774426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.774482 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.774514 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.774540 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.878084 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.878145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.878163 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.878187 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.878205 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.981001 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.981064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.981081 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.981108 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:07 crc kubenswrapper[4797]: I0930 17:44:07.981126 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:07Z","lastTransitionTime":"2025-09-30T17:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.084122 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.084206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.084229 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.084261 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.084283 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.187902 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.188016 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.188035 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.188060 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.188078 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.237320 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.237531 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5"
Sep 30 17:44:08 crc kubenswrapper[4797]: E0930 17:44:08.237601 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:44:08 crc kubenswrapper[4797]: E0930 17:44:08.237764 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.290539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.290595 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.290613 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.290634 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.290654 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.394535 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.394648 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.394678 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.394709 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.394732 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.498172 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.498242 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.498259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.498287 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.498305 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.601027 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.601087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.601112 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.601145 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.601201 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.704538 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.704590 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.704612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.704644 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.704666 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.808116 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.808200 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.808234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.808266 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.808287 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.912770 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.913215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.913274 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.913306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:08 crc kubenswrapper[4797]: I0930 17:44:08.913326 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:08Z","lastTransitionTime":"2025-09-30T17:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.017241 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.017306 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.017323 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.017349 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.017366 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.120568 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.120661 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.120683 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.120715 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.120735 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.224263 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.224339 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.224356 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.224381 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.224400 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.237940 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.238106 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:44:09 crc kubenswrapper[4797]: E0930 17:44:09.238532 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:44:09 crc kubenswrapper[4797]: E0930 17:44:09.238880 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.327883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.328006 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.328085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.328210 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.328232 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.431970 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.432064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.432087 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.432119 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.432142 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.535741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.535818 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.535846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.535880 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.535905 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.639887 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.639956 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.639971 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.639999 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.640015 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.743557 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.743618 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.743637 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.743666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.743686 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.846972 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.847049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.847067 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.847094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.847114 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.950319 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.950407 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.950464 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.950494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:09 crc kubenswrapper[4797]: I0930 17:44:09.950517 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:09Z","lastTransitionTime":"2025-09-30T17:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.053764 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.053846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.053866 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.053894 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.053912 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.157094 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.157180 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.157199 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.157227 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.157246 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.238069 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:10 crc kubenswrapper[4797]: E0930 17:44:10.238265 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.238673 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:10 crc kubenswrapper[4797]: E0930 17:44:10.238869 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.256594 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4796397d-04da-4dd1-8122-1f8ac0b8b8cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde9e1682748fabe8cec6e79c5bccf77ea10d314176b06488a7499e598bcc0fb\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f0194def0b31170f584b3924c0815f12360b8d24792cc29135a8d862719d56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.260030 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.260110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.260133 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.260156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.260172 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.291233 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d0bd54-2348-46f4-946c-433d3efd31b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a024cbb4ade6845992e23a551cb2943af6284a0cedb1a239f1628cc0d3397dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://950cb623e3a538d6a7bc2f92a8ec4f5c3da04849d60cd81ad20d8eb3238bebef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e986d027f96e98f891825b81826e19c41a6db11dfcc51c30e6e51d63595cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26add6e5fa14413e3d971ea0257d6a84b59244e3b6d0d8cff65fed8497c34e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0dbea40a1ca6340a27bd5ac9820020128f8edc39f31c9d439d8f63c9a2e501b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0997d5262a94294a689c4af0e8e5e8f545ceb31a106d74c44825a5a14657254c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea65c804f51c688cb1cfecd9e7d5e7990ce9c43bea321075255d10e0c35ffb63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d60184de500709fc690f61ec298f1af82204bb778f17e38392def51b5bfc716\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T17:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.315381 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55493d9-a25d-4953-880d-a03401a2d4eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://517d3f55a8be7d10ed92a3dc9dfa571708854e443ba0e8ca1c70c1ac60a8cc03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e227d1f3831a246e344333e0911470ab31c8c290e29deeae3828788c7d77b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af74e83ffea1020bda35cc8a616b0eb30665e3b1767600df8b1b95df994c48e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd32cc163eef00734864635bc47a5a20489ad955c52161d6d31d3a477a6a8458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.335783 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.358384 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w74xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba20a5a-9a27-4df1-899d-a107aef7a231\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:40Z\\\",\\\"message\\\":\\\"2025-09-30T17:42:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8\\\\n2025-09-30T17:42:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cbb670a2-4918-419b-a4b6-ae75e9cc44f8 to /host/opt/cni/bin/\\\\n2025-09-30T17:42:55Z [verbose] multus-daemon started\\\\n2025-09-30T17:42:55Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T17:43:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmnnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w74xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.363404 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.363475 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.363489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.363512 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.363529 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.398062 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c749a60-66ac-44d6-955f-a3d050b12758\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:43:51Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 17:43:51.357056 6853 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 17:43:51.354925 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:43:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42201ea658aacdc0d3
bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t72jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g447b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.414334 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5658q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c2c98-8568-4b97-bc8c-13161ad0c7c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82e2e436b2986b7490d3f18d09ee20773ccf5c5210e2c7392ea753a1c9e47095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxtzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5658q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.435774 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.456021 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.467009 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.467049 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.467062 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc 
kubenswrapper[4797]: I0930 17:44:10.467085 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.467101 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.475134 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde70173e5bedd26cc4c5a8331405d745fe8483ce3b4bb8c9ef5933aa3be8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.494289 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hprkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02901280-033f-4eb8-91bd-c1a5ba1358c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2833d39b8afc156e6254d002785a0210b3916a9372fede04ce41b8335ed8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcwpd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hprkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.513379 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d03b039-cc5b-4e6d-ad02-c41e8b60004f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25af67128ab71d0a87eb7114b93f4e17cb14880cd8bb09d15a06bd86bd280ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1473c6c16a895dad835eae8bfaf3f1aafdfdea87c6eda9a849fde4bd7a4a820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:43:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2k97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xq2hl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.538956 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb19c553-a86d-4e58-9f1d-96aff21a7769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6eb37c0c80b8b3194e9859e6d77ebbc61e58f50c2eb370bf667c841b9495136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff01527322cf3b3a2a4d773e4fc56abd4b8aba00f19a3eda908fddcb1a62745\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6015ae6f7b50585321b98b17e71121d2b12c8dceaddb6d74697cbe47740269\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2fd4f9da8eb3390c91b9d5f8b4afc55f65f10250912510e20b9cd22a779517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f682c89a12e74af01e75b0c96c7a46e0c109fcf269196f6a643828f96ef777e5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:42:50Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759254164\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759254164\\\\\\\\\\\\\\\" (2025-09-30 16:42:44 +0000 UTC to 2026-09-30 16:42:44 +0000 UTC (now=2025-09-30 17:42:50.665397565 +0000 UTC))\\\\\\\"\\\\nI0930 17:42:50.665463 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 17:42:50.665486 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0930 17:42:50.665511 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665531 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0930 17:42:50.665601 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2344784656/tls.crt::/tmp/serving-cert-2344784656/tls.key\\\\\\\"\\\\nI0930 17:42:50.665634 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0930 17:42:50.665745 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665764 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0930 17:42:50.665782 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0930 17:42:50.665788 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0930 17:42:50.665877 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0930 17:42:50.665895 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0930 17:42:50.666638 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6fdd9ea88fe478572b11ff857ddf392eec109373f371081e8e4b79173a4edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d0be89466f1ac908d8ab46d6597f001003b75a092c19a2286139980108e0d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.566352 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06bb67a9fa125547ff7c15be6de0f690dff67b21d4016173f32f72d5fb07b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717f19debcbd8462396e6093d5b14a9fb2f8d758375e496b5adcc12513d9a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.571019 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.571086 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.571110 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.571144 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.571325 4797 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.593328 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:10 crc kubenswrapper[4797]: E0930 17:44:10.593596 4797 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.593644 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d69329-116b-4e67-9b30-979bf7309128\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://816b1f572dba8fe922759e909cca6ab41a9446a46c349f61f9cac8b232220e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa7019d86a7a94552e63c83b28ac433e43e533fa12f380569c094fb565a79ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775a0c99fbb53a765f4939ccffe3827c150cbb0aa194b70a801911305239efac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://04ba2c31580bfae882a388ce91949bc24cc0cb56c26138eb77da5e038e12e78d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: E0930 17:44:10.593733 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs podName:d2fc9be6-9aff-4e05-aadc-5a81cbfea32e nodeName:}" failed. No retries permitted until 2025-09-30 17:45:14.593695567 +0000 UTC m=+165.116194845 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs") pod "network-metrics-daemon-rx9f5" (UID: "d2fc9be6-9aff-4e05-aadc-5a81cbfea32e") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.617732 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c60909137e17763e4ba6c5614cdf344e3b40404457093222ec0cc3ba3244acb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.637809 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec455803-9758-4ad4-a627-ce3ad63812c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81db8d913825e16e27873311e670232c38fa0c387c0d8c688789bad05bb405c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rfj8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b8bg9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.661717 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5be80c8f-41bd-41be-a86f-8c69e7655592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dff1eff8a585ead2ed06e5bed1127abdbf312006e28c71c001857900df8ea78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d29ec22a13d3e9b5b4f310d1e17a6d569dd6f3e71c758722a37ca7b9b07db2fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adabff269c9b75bb38d29b1b3b08590269bb79bee75731e1375b7034fbbffb1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3121dfeab78c91fdcd2f09b921bf98538788b551283ffa1eddd0331fc51ca65f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369cba029ab5e1b3cab5717443d957ec474141effbc9035dd69327fcd9f6e8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbc0856d90a07122a6fa50c314ba378358f8180bf1f2de164609698d923b9afa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119fbad058344267725ecf5d0f9bdfeda980d0c86ea63d0ef7fc8c3f3f9c50cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5k26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:42:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4zbp6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.673491 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.673533 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.673546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.673564 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.673575 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.678584 4797 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:43:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whrqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:43:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rx9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:10 crc 
kubenswrapper[4797]: I0930 17:44:10.776471 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.776551 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.776579 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.776612 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.776635 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.879343 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.879469 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.879489 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.879521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.879539 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.982470 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.982528 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.982546 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.982572 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:10 crc kubenswrapper[4797]: I0930 17:44:10.982590 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:10Z","lastTransitionTime":"2025-09-30T17:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.085928 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.086011 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.086046 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.086078 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.086099 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.190515 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.190602 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.190621 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.190652 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.190672 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.237638 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.237680 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:11 crc kubenswrapper[4797]: E0930 17:44:11.237938 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:11 crc kubenswrapper[4797]: E0930 17:44:11.238070 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.294297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.294350 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.294418 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.294472 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.294490 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.397189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.397243 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.397259 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.397283 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.397301 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.500370 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.500426 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.500458 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.500478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.500492 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.604225 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.604300 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.604320 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.604347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.604366 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.707496 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.707530 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.707540 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.707555 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.707565 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.811741 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.811846 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.811874 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.811905 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.811923 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.915128 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.915222 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.915244 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.915270 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:11 crc kubenswrapper[4797]: I0930 17:44:11.915290 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:11Z","lastTransitionTime":"2025-09-30T17:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.019073 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.019138 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.019156 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.019183 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.019204 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.122786 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.122851 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.122883 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.122918 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.122940 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.225405 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.225463 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.225476 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.225494 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.225504 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.238106 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.238206 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:12 crc kubenswrapper[4797]: E0930 17:44:12.238320 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:12 crc kubenswrapper[4797]: E0930 17:44:12.238504 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.329103 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.329170 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.329189 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.329220 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.329239 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.432944 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.433012 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.433032 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.433059 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.433078 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.536666 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.536769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.536790 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.536816 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.536837 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.641165 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.641239 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.641252 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.641273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.641295 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.745064 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.745130 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.745152 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.745179 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.745196 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.849332 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.849387 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.849409 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.849467 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.849484 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.953101 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.953206 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.953234 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.953273 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:12 crc kubenswrapper[4797]: I0930 17:44:12.953299 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:12Z","lastTransitionTime":"2025-09-30T17:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.056660 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.056727 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.056743 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.056769 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.056786 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.160403 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.160520 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.160539 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.160563 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.160582 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.237649 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.237728 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:13 crc kubenswrapper[4797]: E0930 17:44:13.237832 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:13 crc kubenswrapper[4797]: E0930 17:44:13.238295 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.243416 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.243484 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.243500 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.243521 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.243537 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: E0930 17:44:13.268077 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.273297 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.273347 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.273364 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.273392 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.273411 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: E0930 17:44:13.295085 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.300478 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.300552 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.300573 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.300603 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.300626 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: E0930 17:44:13.323882 4797 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:44:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ffa71b44-8856-40bb-9dd0-c146b8624485\\\",\\\"systemUUID\\\":\\\"8f771605-5354-4577-b1b4-ab7637d1e89f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:44:13Z is after 2025-08-24T17:21:41Z" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.332146 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.332215 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.332242 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.332289 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.332313 4797 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:44:13Z","lastTransitionTime":"2025-09-30T17:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.412899 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md"] Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.414299 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.417988 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.418091 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.418263 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.418780 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.451898 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.451865898 podStartE2EDuration="1m19.451865898s" podCreationTimestamp="2025-09-30 17:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.450972955 +0000 UTC m=+103.973472223" watchObservedRunningTime="2025-09-30 17:44:13.451865898 +0000 UTC m=+103.974365176" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.499641 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w74xm" podStartSLOduration=83.499577825 podStartE2EDuration="1m23.499577825s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.499180785 +0000 UTC m=+104.021680063" watchObservedRunningTime="2025-09-30 17:44:13.499577825 +0000 UTC 
m=+104.022077073" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.529540 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31bb6d07-078a-4851-bff9-ff3083885da8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.529713 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.529768 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.529875 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31bb6d07-078a-4851-bff9-ff3083885da8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.530034 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/31bb6d07-078a-4851-bff9-ff3083885da8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.565464 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5658q" podStartSLOduration=83.565414356 podStartE2EDuration="1m23.565414356s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.548686279 +0000 UTC m=+104.071185567" watchObservedRunningTime="2025-09-30 17:44:13.565414356 +0000 UTC m=+104.087913604" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.565677 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.565666352 podStartE2EDuration="25.565666352s" podCreationTimestamp="2025-09-30 17:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.56559815 +0000 UTC m=+104.088097478" watchObservedRunningTime="2025-09-30 17:44:13.565666352 +0000 UTC m=+104.088165600" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.617227 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.617193136 podStartE2EDuration="1m21.617193136s" podCreationTimestamp="2025-09-30 17:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.614638911 +0000 UTC m=+104.137138159" watchObservedRunningTime="2025-09-30 17:44:13.617193136 +0000 UTC m=+104.139692414" 
Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.630878 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31bb6d07-078a-4851-bff9-ff3083885da8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.630955 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31bb6d07-078a-4851-bff9-ff3083885da8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.631007 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31bb6d07-078a-4851-bff9-ff3083885da8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.631090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.631171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.631287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.631453 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/31bb6d07-078a-4851-bff9-ff3083885da8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.633120 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31bb6d07-078a-4851-bff9-ff3083885da8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.639973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31bb6d07-078a-4851-bff9-ff3083885da8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.652666 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-hprkg" podStartSLOduration=83.652623391 podStartE2EDuration="1m23.652623391s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.651942614 +0000 UTC m=+104.174441872" watchObservedRunningTime="2025-09-30 17:44:13.652623391 +0000 UTC m=+104.175122669" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.660652 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31bb6d07-078a-4851-bff9-ff3083885da8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4m7md\" (UID: \"31bb6d07-078a-4851-bff9-ff3083885da8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.673649 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xq2hl" podStartSLOduration=82.673613436 podStartE2EDuration="1m22.673613436s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.673111323 +0000 UTC m=+104.195610601" watchObservedRunningTime="2025-09-30 17:44:13.673613436 +0000 UTC m=+104.196112714" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.739199 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.759919 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.759898538 podStartE2EDuration="1m22.759898538s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.757842365 +0000 UTC m=+104.280341683" watchObservedRunningTime="2025-09-30 17:44:13.759898538 +0000 UTC m=+104.282397776" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.808454 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podStartSLOduration=83.808419276 podStartE2EDuration="1m23.808419276s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.807930363 +0000 UTC m=+104.330429611" watchObservedRunningTime="2025-09-30 17:44:13.808419276 +0000 UTC m=+104.330918514" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.827201 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4zbp6" podStartSLOduration=83.827174234 podStartE2EDuration="1m23.827174234s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.826732643 +0000 UTC m=+104.349231901" watchObservedRunningTime="2025-09-30 17:44:13.827174234 +0000 UTC m=+104.349673482" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.859325 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.859296885 podStartE2EDuration="54.859296885s" podCreationTimestamp="2025-09-30 17:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:13.857069208 +0000 UTC m=+104.379568496" watchObservedRunningTime="2025-09-30 17:44:13.859296885 +0000 UTC m=+104.381796163" Sep 30 17:44:13 crc kubenswrapper[4797]: I0930 17:44:13.877537 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" event={"ID":"31bb6d07-078a-4851-bff9-ff3083885da8","Type":"ContainerStarted","Data":"8ad1f3584f86e35d77c2300d71ed8b1f4221da7bb014b74d564a3f186b80a651"} Sep 30 17:44:14 crc kubenswrapper[4797]: I0930 17:44:14.237769 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:14 crc kubenswrapper[4797]: I0930 17:44:14.237783 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:14 crc kubenswrapper[4797]: E0930 17:44:14.238016 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:14 crc kubenswrapper[4797]: E0930 17:44:14.238154 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:14 crc kubenswrapper[4797]: I0930 17:44:14.890169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" event={"ID":"31bb6d07-078a-4851-bff9-ff3083885da8","Type":"ContainerStarted","Data":"bce9d72719fb703efd10c7ac0a824147ff050e74c2f0f20446af569238e5529b"} Sep 30 17:44:15 crc kubenswrapper[4797]: I0930 17:44:15.237757 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:15 crc kubenswrapper[4797]: I0930 17:44:15.237758 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:15 crc kubenswrapper[4797]: E0930 17:44:15.237932 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:15 crc kubenswrapper[4797]: E0930 17:44:15.238001 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:16 crc kubenswrapper[4797]: I0930 17:44:16.237897 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:16 crc kubenswrapper[4797]: I0930 17:44:16.237979 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:16 crc kubenswrapper[4797]: E0930 17:44:16.238209 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:16 crc kubenswrapper[4797]: E0930 17:44:16.238801 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:17 crc kubenswrapper[4797]: I0930 17:44:17.238207 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:17 crc kubenswrapper[4797]: I0930 17:44:17.238242 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:17 crc kubenswrapper[4797]: E0930 17:44:17.238670 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:17 crc kubenswrapper[4797]: I0930 17:44:17.239012 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:44:17 crc kubenswrapper[4797]: E0930 17:44:17.239076 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:17 crc kubenswrapper[4797]: E0930 17:44:17.239207 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:44:18 crc kubenswrapper[4797]: I0930 17:44:18.237198 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:18 crc kubenswrapper[4797]: I0930 17:44:18.237251 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:18 crc kubenswrapper[4797]: E0930 17:44:18.237561 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:18 crc kubenswrapper[4797]: E0930 17:44:18.237785 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:19 crc kubenswrapper[4797]: I0930 17:44:19.237532 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:19 crc kubenswrapper[4797]: I0930 17:44:19.237556 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:19 crc kubenswrapper[4797]: E0930 17:44:19.237752 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:19 crc kubenswrapper[4797]: E0930 17:44:19.237937 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:20 crc kubenswrapper[4797]: I0930 17:44:20.237999 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:20 crc kubenswrapper[4797]: I0930 17:44:20.238067 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:20 crc kubenswrapper[4797]: E0930 17:44:20.240305 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:20 crc kubenswrapper[4797]: E0930 17:44:20.240467 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:21 crc kubenswrapper[4797]: I0930 17:44:21.238040 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:21 crc kubenswrapper[4797]: E0930 17:44:21.238252 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:21 crc kubenswrapper[4797]: I0930 17:44:21.239272 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:21 crc kubenswrapper[4797]: E0930 17:44:21.239736 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:22 crc kubenswrapper[4797]: I0930 17:44:22.237515 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:22 crc kubenswrapper[4797]: I0930 17:44:22.237572 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:22 crc kubenswrapper[4797]: E0930 17:44:22.237772 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:22 crc kubenswrapper[4797]: E0930 17:44:22.238078 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:23 crc kubenswrapper[4797]: I0930 17:44:23.237533 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:23 crc kubenswrapper[4797]: I0930 17:44:23.237711 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:23 crc kubenswrapper[4797]: E0930 17:44:23.238003 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:23 crc kubenswrapper[4797]: E0930 17:44:23.238146 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:24 crc kubenswrapper[4797]: I0930 17:44:24.237603 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:24 crc kubenswrapper[4797]: E0930 17:44:24.238136 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:24 crc kubenswrapper[4797]: I0930 17:44:24.237929 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:24 crc kubenswrapper[4797]: E0930 17:44:24.238674 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:25 crc kubenswrapper[4797]: I0930 17:44:25.237486 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:25 crc kubenswrapper[4797]: E0930 17:44:25.237734 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:25 crc kubenswrapper[4797]: I0930 17:44:25.237842 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:25 crc kubenswrapper[4797]: E0930 17:44:25.238091 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.237937 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.238002 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:26 crc kubenswrapper[4797]: E0930 17:44:26.238144 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:26 crc kubenswrapper[4797]: E0930 17:44:26.238295 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.939813 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/1.log" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.940692 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/0.log" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.940898 4797 generic.go:334] "Generic (PLEG): container finished" podID="aba20a5a-9a27-4df1-899d-a107aef7a231" containerID="80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f" exitCode=1 Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.940958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerDied","Data":"80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f"} Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.941045 4797 scope.go:117] "RemoveContainer" containerID="df7b0970e87928b3f880597c39243fb135be52aedf6c14e542a8fbe51acfc6ea" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.942735 4797 scope.go:117] "RemoveContainer" containerID="80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f" Sep 30 17:44:26 crc kubenswrapper[4797]: E0930 17:44:26.943084 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-w74xm_openshift-multus(aba20a5a-9a27-4df1-899d-a107aef7a231)\"" pod="openshift-multus/multus-w74xm" podUID="aba20a5a-9a27-4df1-899d-a107aef7a231" Sep 30 17:44:26 crc kubenswrapper[4797]: I0930 17:44:26.974337 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4m7md" podStartSLOduration=96.974307927 podStartE2EDuration="1m36.974307927s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:14.916264745 +0000 UTC m=+105.438764023" watchObservedRunningTime="2025-09-30 17:44:26.974307927 +0000 UTC m=+117.496807205" Sep 30 17:44:27 crc kubenswrapper[4797]: I0930 17:44:27.237867 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:27 crc kubenswrapper[4797]: I0930 17:44:27.237999 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:27 crc kubenswrapper[4797]: E0930 17:44:27.238149 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:27 crc kubenswrapper[4797]: E0930 17:44:27.238344 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:27 crc kubenswrapper[4797]: I0930 17:44:27.948923 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/1.log" Sep 30 17:44:28 crc kubenswrapper[4797]: I0930 17:44:28.237596 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:28 crc kubenswrapper[4797]: I0930 17:44:28.237590 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:28 crc kubenswrapper[4797]: E0930 17:44:28.237984 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:28 crc kubenswrapper[4797]: E0930 17:44:28.238320 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:29 crc kubenswrapper[4797]: I0930 17:44:29.237422 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:29 crc kubenswrapper[4797]: I0930 17:44:29.237577 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:29 crc kubenswrapper[4797]: E0930 17:44:29.237662 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:29 crc kubenswrapper[4797]: E0930 17:44:29.237767 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:29 crc kubenswrapper[4797]: I0930 17:44:29.239169 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:44:29 crc kubenswrapper[4797]: E0930 17:44:29.239521 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g447b_openshift-ovn-kubernetes(4c749a60-66ac-44d6-955f-a3d050b12758)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" Sep 30 17:44:30 crc kubenswrapper[4797]: E0930 17:44:30.156059 4797 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 17:44:30 crc kubenswrapper[4797]: I0930 17:44:30.238144 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:30 crc kubenswrapper[4797]: I0930 17:44:30.238225 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:30 crc kubenswrapper[4797]: E0930 17:44:30.240484 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:30 crc kubenswrapper[4797]: E0930 17:44:30.240668 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:30 crc kubenswrapper[4797]: E0930 17:44:30.357833 4797 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:44:31 crc kubenswrapper[4797]: I0930 17:44:31.237396 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:31 crc kubenswrapper[4797]: I0930 17:44:31.237490 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:31 crc kubenswrapper[4797]: E0930 17:44:31.237692 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:31 crc kubenswrapper[4797]: E0930 17:44:31.237863 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:32 crc kubenswrapper[4797]: I0930 17:44:32.237917 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:32 crc kubenswrapper[4797]: E0930 17:44:32.238180 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:32 crc kubenswrapper[4797]: I0930 17:44:32.238422 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:32 crc kubenswrapper[4797]: E0930 17:44:32.238831 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:33 crc kubenswrapper[4797]: I0930 17:44:33.237853 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:33 crc kubenswrapper[4797]: I0930 17:44:33.238037 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:33 crc kubenswrapper[4797]: E0930 17:44:33.238101 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:33 crc kubenswrapper[4797]: E0930 17:44:33.238260 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:34 crc kubenswrapper[4797]: I0930 17:44:34.238032 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:34 crc kubenswrapper[4797]: I0930 17:44:34.238110 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:34 crc kubenswrapper[4797]: E0930 17:44:34.238289 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:34 crc kubenswrapper[4797]: E0930 17:44:34.238409 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:35 crc kubenswrapper[4797]: I0930 17:44:35.237345 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:35 crc kubenswrapper[4797]: I0930 17:44:35.237381 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:35 crc kubenswrapper[4797]: E0930 17:44:35.237602 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:35 crc kubenswrapper[4797]: E0930 17:44:35.237751 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:35 crc kubenswrapper[4797]: E0930 17:44:35.359229 4797 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:44:36 crc kubenswrapper[4797]: I0930 17:44:36.238090 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:36 crc kubenswrapper[4797]: I0930 17:44:36.238122 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:36 crc kubenswrapper[4797]: E0930 17:44:36.238833 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:36 crc kubenswrapper[4797]: E0930 17:44:36.239073 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:37 crc kubenswrapper[4797]: I0930 17:44:37.237365 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:37 crc kubenswrapper[4797]: I0930 17:44:37.237459 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:37 crc kubenswrapper[4797]: E0930 17:44:37.237658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:37 crc kubenswrapper[4797]: E0930 17:44:37.237753 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:38 crc kubenswrapper[4797]: I0930 17:44:38.237652 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:38 crc kubenswrapper[4797]: E0930 17:44:38.237861 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:38 crc kubenswrapper[4797]: I0930 17:44:38.238182 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:38 crc kubenswrapper[4797]: I0930 17:44:38.238697 4797 scope.go:117] "RemoveContainer" containerID="80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f" Sep 30 17:44:38 crc kubenswrapper[4797]: E0930 17:44:38.238926 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:38 crc kubenswrapper[4797]: I0930 17:44:38.995218 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/1.log" Sep 30 17:44:38 crc kubenswrapper[4797]: I0930 17:44:38.995302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerStarted","Data":"07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020"} Sep 30 17:44:39 crc kubenswrapper[4797]: I0930 17:44:39.237721 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:39 crc kubenswrapper[4797]: E0930 17:44:39.238197 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:39 crc kubenswrapper[4797]: I0930 17:44:39.237814 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:39 crc kubenswrapper[4797]: E0930 17:44:39.238626 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:40 crc kubenswrapper[4797]: I0930 17:44:40.237765 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:40 crc kubenswrapper[4797]: I0930 17:44:40.237837 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:40 crc kubenswrapper[4797]: E0930 17:44:40.239393 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:40 crc kubenswrapper[4797]: E0930 17:44:40.239625 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:40 crc kubenswrapper[4797]: E0930 17:44:40.361082 4797 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:44:41 crc kubenswrapper[4797]: I0930 17:44:41.237371 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:41 crc kubenswrapper[4797]: I0930 17:44:41.237481 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:41 crc kubenswrapper[4797]: E0930 17:44:41.237610 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:41 crc kubenswrapper[4797]: E0930 17:44:41.237744 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:42 crc kubenswrapper[4797]: I0930 17:44:42.238024 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:42 crc kubenswrapper[4797]: I0930 17:44:42.238030 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:42 crc kubenswrapper[4797]: E0930 17:44:42.238214 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:42 crc kubenswrapper[4797]: E0930 17:44:42.238476 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:42 crc kubenswrapper[4797]: I0930 17:44:42.239221 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.019486 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/3.log" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.022545 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerStarted","Data":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.023140 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.237251 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:43 crc kubenswrapper[4797]: E0930 17:44:43.237421 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.237675 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:43 crc kubenswrapper[4797]: E0930 17:44:43.237734 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.399201 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podStartSLOduration=112.399175075 podStartE2EDuration="1m52.399175075s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:43.071093893 +0000 UTC m=+133.593593191" watchObservedRunningTime="2025-09-30 17:44:43.399175075 +0000 UTC m=+133.921674323" Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.400600 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rx9f5"] Sep 30 17:44:43 crc kubenswrapper[4797]: I0930 17:44:43.400724 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:43 crc kubenswrapper[4797]: E0930 17:44:43.400825 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:44 crc kubenswrapper[4797]: I0930 17:44:44.237928 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:44 crc kubenswrapper[4797]: E0930 17:44:44.238120 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:45 crc kubenswrapper[4797]: I0930 17:44:45.237543 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:45 crc kubenswrapper[4797]: E0930 17:44:45.238031 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:45 crc kubenswrapper[4797]: I0930 17:44:45.237631 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:45 crc kubenswrapper[4797]: E0930 17:44:45.238146 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:45 crc kubenswrapper[4797]: I0930 17:44:45.237581 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:45 crc kubenswrapper[4797]: E0930 17:44:45.238234 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:45 crc kubenswrapper[4797]: E0930 17:44:45.363048 4797 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:44:46 crc kubenswrapper[4797]: I0930 17:44:46.237402 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:46 crc kubenswrapper[4797]: E0930 17:44:46.237563 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:47 crc kubenswrapper[4797]: I0930 17:44:47.237568 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:47 crc kubenswrapper[4797]: I0930 17:44:47.237602 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:47 crc kubenswrapper[4797]: E0930 17:44:47.237749 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:47 crc kubenswrapper[4797]: I0930 17:44:47.237836 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:47 crc kubenswrapper[4797]: E0930 17:44:47.237947 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:47 crc kubenswrapper[4797]: E0930 17:44:47.238356 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:48 crc kubenswrapper[4797]: I0930 17:44:48.237805 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:48 crc kubenswrapper[4797]: E0930 17:44:48.238061 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:49 crc kubenswrapper[4797]: I0930 17:44:49.237649 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:49 crc kubenswrapper[4797]: I0930 17:44:49.237735 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:49 crc kubenswrapper[4797]: I0930 17:44:49.237655 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:49 crc kubenswrapper[4797]: E0930 17:44:49.237876 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:44:49 crc kubenswrapper[4797]: E0930 17:44:49.238026 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:44:49 crc kubenswrapper[4797]: E0930 17:44:49.238147 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rx9f5" podUID="d2fc9be6-9aff-4e05-aadc-5a81cbfea32e" Sep 30 17:44:50 crc kubenswrapper[4797]: I0930 17:44:50.238003 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:50 crc kubenswrapper[4797]: E0930 17:44:50.239884 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.237087 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.237165 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.237302 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.242204 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.242515 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.242613 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 17:44:51 crc kubenswrapper[4797]: I0930 17:44:51.242788 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 17:44:52 crc kubenswrapper[4797]: I0930 17:44:52.237707 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:52 crc kubenswrapper[4797]: I0930 17:44:52.240892 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 17:44:52 crc kubenswrapper[4797]: I0930 17:44:52.241266 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 17:44:53 crc kubenswrapper[4797]: I0930 17:44:53.654463 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.504325 4797 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.556973 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8p4lg"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.557869 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.565230 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.565299 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.565592 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.566498 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.567108 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.567368 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.574279 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.594913 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sx5xp"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.596151 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.596916 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.597411 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.595611 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.605527 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tgkt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.606242 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.606393 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.606818 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.611867 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.612641 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.612706 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.613180 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.615661 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.616468 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.617041 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.621768 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.622045 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.631248 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.631586 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632007 4797 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632161 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632309 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632550 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632692 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.632945 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.633078 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.633205 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.633343 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.637622 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.637799 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.637980 
4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.638123 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.638282 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650813 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650850 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/af6cfc48-04fd-45f2-b891-4a9d6a484e26-kube-api-access-cp4r2\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650874 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2bz7\" (UniqueName: \"kubernetes.io/projected/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-kube-api-access-s2bz7\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650894 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a736d1e6-e7c0-42ca-b7c4-8717df579416-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650911 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650925 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650939 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e295c6d-6b86-4912-a202-31b22488bd57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert\") pod \"console-f9d7485db-ngfnz\" 
(UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650965 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650977 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-images\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.650993 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-serving-cert\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-image-import-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651022 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-encryption-config\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651036 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-auth-proxy-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651050 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651064 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf854\" (UniqueName: \"kubernetes.io/projected/a736d1e6-e7c0-42ca-b7c4-8717df579416-kube-api-access-mf854\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651080 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc 
kubenswrapper[4797]: I0930 17:44:54.651095 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mq69\" (UniqueName: \"kubernetes.io/projected/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-kube-api-access-9mq69\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651108 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-encryption-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651120 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651136 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651150 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-config\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651172 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a736d1e6-e7c0-42ca-b7c4-8717df579416-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651187 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-machine-approver-tls\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-serving-cert\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc 
kubenswrapper[4797]: I0930 17:44:54.651226 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-policies\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651246 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651262 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651282 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-serving-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651296 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-serving-cert\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: 
\"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651311 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltsfz\" (UniqueName: \"kubernetes.io/projected/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-kube-api-access-ltsfz\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651325 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9hrq\" (UniqueName: \"kubernetes.io/projected/9e295c6d-6b86-4912-a202-31b22488bd57-kube-api-access-l9hrq\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651338 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdwjx\" (UniqueName: \"kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651373 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-dir\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651388 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hljd\" (UniqueName: \"kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651402 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-node-pullsecrets\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651418 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit-dir\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651449 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651465 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651480 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-client\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651550 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651590 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-config\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-client\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651653 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.651702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8chm\" (UniqueName: \"kubernetes.io/projected/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-kube-api-access-m8chm\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.657498 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dhl6s"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.657942 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.658224 4797 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.658504 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gh7d7"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.658603 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.659877 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.666417 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.666672 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.668526 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4t9nl"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.669049 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dt8d"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.669583 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.669602 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.669643 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.670017 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.672285 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.672320 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.672922 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.673215 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.673570 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.675219 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.675641 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676082 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676352 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676818 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.677122 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676362 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.677198 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.677261 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676394 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676400 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676591 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676599 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676688 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.676833 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.678064 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.678144 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.678547 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.679450 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.679976 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681029 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681248 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681449 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681461 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681681 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681779 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681856 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.681925 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 17:44:54 
crc kubenswrapper[4797]: I0930 17:44:54.682018 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.682083 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.682183 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.686344 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ntb2"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.686523 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.686879 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.687103 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.687191 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.687535 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.687938 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.689310 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.689725 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698384 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698533 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698554 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698724 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698783 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.698849 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.705943 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.707161 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 
17:44:54.709081 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.709543 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.710522 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.710343 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.710903 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.711004 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.711193 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.712374 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.712444 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.713182 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.714117 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719336 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719574 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719646 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719783 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719853 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.719857 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720025 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720128 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720353 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720463 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720500 4797 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720736 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.720840 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.721307 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.721617 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.721748 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.721832 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.722174 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.722482 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.722851 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.725350 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.725507 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.725589 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.725888 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.726155 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.726477 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.727200 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.727376 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.728562 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.730124 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.730747 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8p4lg"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.730773 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.731182 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.731354 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.731924 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.732028 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.732113 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.732228 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.732696 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.733238 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.733670 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jzbcv"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.734202 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.735252 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.735998 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.741050 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.741571 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.742296 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.742504 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.743073 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.744419 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.744618 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.746142 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.755118 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tgkt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.755237 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.755413 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wgdps"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.756576 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.758976 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sx5xp"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760513 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-dir\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-node-pullsecrets\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760576 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0be683a6-13c1-4895-a59e-b5a7337633b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760600 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 
17:44:54.760621 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvvp2\" (UniqueName: \"kubernetes.io/projected/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-kube-api-access-pvvp2\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760639 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2859bf41-057a-48e9-b94f-3be799ab24ee-serving-cert\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760657 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-config\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760677 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-client\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760695 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be683a6-13c1-4895-a59e-b5a7337633b4-serving-cert\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: 
\"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760714 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760732 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-config\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760751 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-client\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760771 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npl67\" (UniqueName: \"kubernetes.io/projected/801fce17-d3ba-4444-888a-872baf5698ca-kube-api-access-npl67\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760787 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760805 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801fce17-d3ba-4444-888a-872baf5698ca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760823 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c458bf-7891-460b-957c-73a3f4aa15f6-serving-cert\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760841 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760857 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/af6cfc48-04fd-45f2-b891-4a9d6a484e26-kube-api-access-cp4r2\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760883 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e446878-fbd1-4260-9fe9-f04f6f13172d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760911 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760959 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e295c6d-6b86-4912-a202-31b22488bd57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760976 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.760995 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-serving-cert\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.761024 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-image-import-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.762903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-dir\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.762957 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-node-pullsecrets\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.763824 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.764075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.764539 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-config\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.764680 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765654 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765701 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v567k\" (UniqueName: \"kubernetes.io/projected/2859bf41-057a-48e9-b94f-3be799ab24ee-kube-api-access-v567k\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765793 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-auth-proxy-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765848 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.765870 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02649832-369a-4e11-bf3d-ad7d522629b1-proxy-tls\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766052 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766074 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ba2242-1b56-498e-9b09-24429a43d24e-config\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766210 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1531e53-84bf-4286-8341-10b43cafdeb6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766232 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-serving-cert\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.766248 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6f8\" (UniqueName: \"kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767140 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767170 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-machine-approver-tls\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767199 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-service-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767239 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767262 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fce17-d3ba-4444-888a-872baf5698ca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3abe72d-d1b6-4d28-aa98-b788617fefa6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767343 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3abe72d-d1b6-4d28-aa98-b788617fefa6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ba2242-1b56-498e-9b09-24429a43d24e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767398 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-serving-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767420 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-serving-cert\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767473 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4370e92c-7565-4172-b6f5-4a805a338231-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.767509 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltsfz\" (UniqueName: \"kubernetes.io/projected/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-kube-api-access-ltsfz\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.768207 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-trusted-ca\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " 
pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.771074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hljd\" (UniqueName: \"kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.771119 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3abe72d-d1b6-4d28-aa98-b788617fefa6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.771139 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbjl\" (UniqueName: \"kubernetes.io/projected/a1531e53-84bf-4286-8341-10b43cafdeb6-kube-api-access-2lbjl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.778898 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-client\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.784681 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.784941 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.770654 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-auth-proxy-config\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.785102 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.785487 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.785682 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.771161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgfh\" (UniqueName: \"kubernetes.io/projected/061a6e3e-616c-440a-9db7-3332d1c11361-kube-api-access-ktgfh\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.791866 4797 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.793624 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-serving-cert\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.793770 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-image-import-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.793944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df7jp\" (UniqueName: \"kubernetes.io/projected/71c458bf-7891-460b-957c-73a3f4aa15f6-kube-api-access-df7jp\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.794282 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit-dir\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.794314 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.794373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796333 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit-dir\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796393 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796470 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796697 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796718 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzjgg\" (UniqueName: \"kubernetes.io/projected/4255c7fa-ba59-4363-ad05-cb5e7a8628e4-kube-api-access-bzjgg\") pod \"downloads-7954f5f757-dhl6s\" (UID: \"4255c7fa-ba59-4363-ad05-cb5e7a8628e4\") " pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796791 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp6n8\" (UniqueName: \"kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796820 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796844 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies\") pod 
\"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796919 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtcr\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-kube-api-access-vvtcr\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796950 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796973 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.796997 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797062 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c458bf-7891-460b-957c-73a3f4aa15f6-config\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8chm\" (UniqueName: \"kubernetes.io/projected/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-kube-api-access-m8chm\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797110 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e446878-fbd1-4260-9fe9-f04f6f13172d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797132 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4ba2242-1b56-498e-9b09-24429a43d24e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797154 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-client\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797199 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bz7\" (UniqueName: \"kubernetes.io/projected/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-kube-api-access-s2bz7\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797275 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a736d1e6-e7c0-42ca-b7c4-8717df579416-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797293 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797317 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797323 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797334 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797359 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-images\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797401 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lgfr\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-kube-api-access-2lgfr\") pod 
\"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797423 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-config\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797469 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4lm\" (UniqueName: \"kubernetes.io/projected/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-kube-api-access-hg4lm\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797497 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-encryption-config\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797518 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx826\" (UniqueName: \"kubernetes.io/projected/71d1aee4-a761-40b4-b965-de0c111c4f2a-kube-api-access-mx826\") pod \"migrator-59844c95c7-l2fwf\" (UID: \"71d1aee4-a761-40b4-b965-de0c111c4f2a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797543 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mf854\" (UniqueName: \"kubernetes.io/projected/a736d1e6-e7c0-42ca-b7c4-8717df579416-kube-api-access-mf854\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797669 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797694 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c031e-14c5-4a4a-8bed-1e9729838102-config\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797722 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbzr\" (UniqueName: \"kubernetes.io/projected/02649832-369a-4e11-bf3d-ad7d522629b1-kube-api-access-xxbzr\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797751 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mq69\" (UniqueName: \"kubernetes.io/projected/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-kube-api-access-9mq69\") pod \"authentication-operator-69f744f599-5tgkt\" 
(UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797778 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-encryption-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797805 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797828 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-config\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c031e-14c5-4a4a-8bed-1e9729838102-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797877 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c74b\" (UniqueName: 
\"kubernetes.io/projected/70026aa5-47c8-4f0b-a942-ca0fa33436bc-kube-api-access-9c74b\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797920 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-images\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797943 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.797980 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a736d1e6-e7c0-42ca-b7c4-8717df579416-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.798004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.798028 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.798061 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-serving-cert\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.798745 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dhl6s"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.798783 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gh7d7"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.799355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-client\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.801428 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-etcd-serving-ca\") pod \"apiserver-76f77b778f-sx5xp\" (UID: 
\"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.802625 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.802654 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.802664 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dt8d"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.803635 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804085 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-serving-cert\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804289 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-policies\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70026aa5-47c8-4f0b-a942-ca0fa33436bc-metrics-tls\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804370 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804396 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c031e-14c5-4a4a-8bed-1e9729838102-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804466 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4370e92c-7565-4172-b6f5-4a805a338231-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc 
kubenswrapper[4797]: I0930 17:44:54.804489 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804512 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804558 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hrq\" (UniqueName: \"kubernetes.io/projected/9e295c6d-6b86-4912-a202-31b22488bd57-kube-api-access-l9hrq\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804575 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804591 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdwjx\" (UniqueName: \"kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804608 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd679\" (UniqueName: \"kubernetes.io/projected/0be683a6-13c1-4895-a59e-b5a7337633b4-kube-api-access-nd679\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.804656 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1531e53-84bf-4286-8341-10b43cafdeb6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.805180 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e295c6d-6b86-4912-a202-31b22488bd57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.805446 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.805722 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.805756 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mzs2t"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.806279 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.806327 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.807030 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.807363 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.807565 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-audit\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808185 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808182 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-images\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808486 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808621 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a736d1e6-e7c0-42ca-b7c4-8717df579416-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.808845 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.809342 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.809352 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.809642 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-audit-policies\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.810148 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af6cfc48-04fd-45f2-b891-4a9d6a484e26-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.810976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.811355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-config\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.812689 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-serving-cert\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.812757 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.812901 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-machine-approver-tls\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.813097 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-encryption-config\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.813396 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a736d1e6-e7c0-42ca-b7c4-8717df579416-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.813867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config\") pod \"console-f9d7485db-ngfnz\" (UID: 
\"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.814696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.814766 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.815007 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ntb2"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.816052 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.816235 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.817142 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.818037 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.818986 4797 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-dns/dns-default-hvlqn"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.819783 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.819918 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af6cfc48-04fd-45f2-b891-4a9d6a484e26-encryption-config\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.824842 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.824978 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.827310 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4t9nl"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.828220 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.830151 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.831326 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.834072 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 
17:44:54.834813 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.835998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.837239 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.838147 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.844899 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.844972 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.845331 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.845778 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.847226 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.849878 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.850900 4797 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.851987 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hvlqn"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.853081 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc7x2"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.854488 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dbsss"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.854653 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.855313 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mzs2t"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.855455 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.856870 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.858469 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.859707 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.863610 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wgdps"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.864906 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.865947 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc7x2"] Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.879569 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.900030 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905186 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1531e53-84bf-4286-8341-10b43cafdeb6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905216 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6f8\" (UniqueName: \"kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905233 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-serving-cert\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905268 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-service-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905282 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/801fce17-d3ba-4444-888a-872baf5698ca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3abe72d-d1b6-4d28-aa98-b788617fefa6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905327 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3abe72d-d1b6-4d28-aa98-b788617fefa6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905351 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ba2242-1b56-498e-9b09-24429a43d24e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905375 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4370e92c-7565-4172-b6f5-4a805a338231-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: 
\"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905424 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-trusted-ca\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905465 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgfh\" (UniqueName: \"kubernetes.io/projected/061a6e3e-616c-440a-9db7-3332d1c11361-kube-api-access-ktgfh\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905495 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3abe72d-d1b6-4d28-aa98-b788617fefa6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbjl\" (UniqueName: \"kubernetes.io/projected/a1531e53-84bf-4286-8341-10b43cafdeb6-kube-api-access-2lbjl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905539 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-df7jp\" (UniqueName: \"kubernetes.io/projected/71c458bf-7891-460b-957c-73a3f4aa15f6-kube-api-access-df7jp\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905562 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905586 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905611 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905637 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjgg\" (UniqueName: \"kubernetes.io/projected/4255c7fa-ba59-4363-ad05-cb5e7a8628e4-kube-api-access-bzjgg\") pod \"downloads-7954f5f757-dhl6s\" (UID: 
\"4255c7fa-ba59-4363-ad05-cb5e7a8628e4\") " pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905663 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp6n8\" (UniqueName: \"kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905722 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtcr\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-kube-api-access-vvtcr\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905747 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905768 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/71c458bf-7891-460b-957c-73a3f4aa15f6-config\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905797 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e446878-fbd1-4260-9fe9-f04f6f13172d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905821 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4ba2242-1b56-498e-9b09-24429a43d24e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905850 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-client\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905872 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc 
kubenswrapper[4797]: I0930 17:44:54.905897 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-config\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4lm\" (UniqueName: \"kubernetes.io/projected/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-kube-api-access-hg4lm\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905945 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905965 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lgfr\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-kube-api-access-2lgfr\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.905986 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx826\" (UniqueName: \"kubernetes.io/projected/71d1aee4-a761-40b4-b965-de0c111c4f2a-kube-api-access-mx826\") pod \"migrator-59844c95c7-l2fwf\" (UID: 
\"71d1aee4-a761-40b4-b965-de0c111c4f2a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906021 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c031e-14c5-4a4a-8bed-1e9729838102-config\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906045 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbzr\" (UniqueName: \"kubernetes.io/projected/02649832-369a-4e11-bf3d-ad7d522629b1-kube-api-access-xxbzr\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906079 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-images\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906106 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c031e-14c5-4a4a-8bed-1e9729838102-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906129 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9c74b\" (UniqueName: \"kubernetes.io/projected/70026aa5-47c8-4f0b-a942-ca0fa33436bc-kube-api-access-9c74b\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906151 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70026aa5-47c8-4f0b-a942-ca0fa33436bc-metrics-tls\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c031e-14c5-4a4a-8bed-1e9729838102-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906272 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4370e92c-7565-4172-b6f5-4a805a338231-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906294 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906320 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906366 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906389 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-service-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906461 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd679\" (UniqueName: \"kubernetes.io/projected/0be683a6-13c1-4895-a59e-b5a7337633b4-kube-api-access-nd679\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1531e53-84bf-4286-8341-10b43cafdeb6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906521 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0be683a6-13c1-4895-a59e-b5a7337633b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvvp2\" (UniqueName: \"kubernetes.io/projected/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-kube-api-access-pvvp2\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2859bf41-057a-48e9-b94f-3be799ab24ee-serving-cert\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906579 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-config\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906646 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be683a6-13c1-4895-a59e-b5a7337633b4-serving-cert\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906675 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906698 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npl67\" (UniqueName: \"kubernetes.io/projected/801fce17-d3ba-4444-888a-872baf5698ca-kube-api-access-npl67\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 
crc kubenswrapper[4797]: I0930 17:44:54.906719 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906770 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801fce17-d3ba-4444-888a-872baf5698ca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906797 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c458bf-7891-460b-957c-73a3f4aa15f6-serving-cert\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906820 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906841 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e446878-fbd1-4260-9fe9-f04f6f13172d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: 
\"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906870 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906915 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v567k\" (UniqueName: \"kubernetes.io/projected/2859bf41-057a-48e9-b94f-3be799ab24ee-kube-api-access-v567k\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/02649832-369a-4e11-bf3d-ad7d522629b1-proxy-tls\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.906991 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ba2242-1b56-498e-9b09-24429a43d24e-config\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907013 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907050 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-trusted-ca\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907231 
4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907426 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2859bf41-057a-48e9-b94f-3be799ab24ee-config\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907824 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0be683a6-13c1-4895-a59e-b5a7337633b4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.907903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fce17-d3ba-4444-888a-872baf5698ca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.908303 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.908577 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4370e92c-7565-4172-b6f5-4a805a338231-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.908580 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-ca\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.908844 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.908912 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-serving-cert\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.909028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies\") pod 
\"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.909049 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.909950 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.910178 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/061a6e3e-616c-440a-9db7-3332d1c11361-etcd-client\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.910813 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061a6e3e-616c-440a-9db7-3332d1c11361-config\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.911460 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.911889 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4370e92c-7565-4172-b6f5-4a805a338231-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.912361 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.912366 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.912398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.913077 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.913095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.914948 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.915723 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.915770 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: 
\"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.916182 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801fce17-d3ba-4444-888a-872baf5698ca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.916869 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2859bf41-057a-48e9-b94f-3be799ab24ee-serving-cert\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.917328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70026aa5-47c8-4f0b-a942-ca0fa33436bc-metrics-tls\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.919225 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.919966 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 17:44:54 crc kubenswrapper[4797]: 
I0930 17:44:54.932209 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e446878-fbd1-4260-9fe9-f04f6f13172d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.940051 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.960036 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.980710 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.992669 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be683a6-13c1-4895-a59e-b5a7337633b4-serving-cert\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:54 crc kubenswrapper[4797]: I0930 17:44:54.999868 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.029152 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.031513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e446878-fbd1-4260-9fe9-f04f6f13172d-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.039774 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.047979 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02649832-369a-4e11-bf3d-ad7d522629b1-images\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.061114 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.080120 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.084965 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02649832-369a-4e11-bf3d-ad7d522629b1-proxy-tls\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.100087 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.119727 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.140337 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.160826 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.181424 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.199951 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.214021 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c458bf-7891-460b-957c-73a3f4aa15f6-serving-cert\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.220467 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.228408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c458bf-7891-460b-957c-73a3f4aa15f6-config\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.241063 4797 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.260529 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.281545 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.291003 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.300521 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.321054 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.341399 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.360170 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.391189 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.400814 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 
17:44:55.421274 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.436647 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c031e-14c5-4a4a-8bed-1e9729838102-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.441082 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.459799 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.461956 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c031e-14c5-4a4a-8bed-1e9729838102-config\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.481588 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.501031 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.520338 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.535294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4ba2242-1b56-498e-9b09-24429a43d24e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.540804 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.543301 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ba2242-1b56-498e-9b09-24429a43d24e-config\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.561168 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.581904 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.601554 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.620926 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.630754 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3abe72d-d1b6-4d28-aa98-b788617fefa6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.640831 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.646330 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3abe72d-d1b6-4d28-aa98-b788617fefa6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.660073 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.668631 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1531e53-84bf-4286-8341-10b43cafdeb6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.683078 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.700980 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.709468 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1531e53-84bf-4286-8341-10b43cafdeb6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.719685 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.739132 4797 request.go:700] Waited for 1.009455038s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.741181 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.760072 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.780303 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.786646 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.800182 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.820895 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.840589 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.861749 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.880913 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.900868 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.921077 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.944501 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 17:44:55 crc kubenswrapper[4797]: 
I0930 17:44:55.960727 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 17:44:55 crc kubenswrapper[4797]: I0930 17:44:55.981395 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.000272 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.020826 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.040550 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.060838 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.082278 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.141168 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.161615 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.180993 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.200853 4797 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.220409 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.244514 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.261045 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.281278 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.300698 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.340665 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.347202 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/af6cfc48-04fd-45f2-b891-4a9d6a484e26-kube-api-access-cp4r2\") pod \"apiserver-7bbb656c7d-xqsxf\" (UID: \"af6cfc48-04fd-45f2-b891-4a9d6a484e26\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.349664 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.361707 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.407222 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltsfz\" (UniqueName: \"kubernetes.io/projected/95ff3b88-c7c3-4e2f-abc3-8026bb941a37-kube-api-access-ltsfz\") pod \"machine-approver-56656f9798-wk4t6\" (UID: \"95ff3b88-c7c3-4e2f-abc3-8026bb941a37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.429261 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hljd\" (UniqueName: \"kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd\") pod \"route-controller-manager-6576b87f9c-2s4jq\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.446989 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf854\" (UniqueName: \"kubernetes.io/projected/a736d1e6-e7c0-42ca-b7c4-8717df579416-kube-api-access-mf854\") pod \"openshift-controller-manager-operator-756b6f6bc6-6cgs6\" (UID: \"a736d1e6-e7c0-42ca-b7c4-8717df579416\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.463360 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.477668 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2bz7\" (UniqueName: 
\"kubernetes.io/projected/c0adfd2b-83f6-41ea-beee-cd0a5ac3973b-kube-api-access-s2bz7\") pod \"machine-api-operator-5694c8668f-8p4lg\" (UID: \"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.481282 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.482874 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.500980 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.542585 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.553090 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mq69\" (UniqueName: \"kubernetes.io/projected/8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3-kube-api-access-9mq69\") pod \"authentication-operator-69f744f599-5tgkt\" (UID: \"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.571976 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.589407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9hrq\" (UniqueName: \"kubernetes.io/projected/9e295c6d-6b86-4912-a202-31b22488bd57-kube-api-access-l9hrq\") pod \"cluster-samples-operator-665b6dd947-d2qqd\" (UID: \"9e295c6d-6b86-4912-a202-31b22488bd57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.608532 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8chm\" (UniqueName: \"kubernetes.io/projected/ebfcdd10-b87b-4d46-9bea-3d1b98273a28-kube-api-access-m8chm\") pod \"apiserver-76f77b778f-sx5xp\" (UID: \"ebfcdd10-b87b-4d46-9bea-3d1b98273a28\") " pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.621567 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.624962 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdwjx\" (UniqueName: \"kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx\") pod \"console-f9d7485db-ngfnz\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.626383 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.633203 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.639105 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.641699 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.646898 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf"] Sep 30 17:44:56 crc kubenswrapper[4797]: W0930 17:44:56.658373 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf6cfc48_04fd_45f2_b891_4a9d6a484e26.slice/crio-4178538a78495bb22e1b6c85ac887acbc43f1f19c455d8b6ede6ed0ea53dcad9 WatchSource:0}: Error finding container 4178538a78495bb22e1b6c85ac887acbc43f1f19c455d8b6ede6ed0ea53dcad9: Status 404 returned error can't find the container with id 4178538a78495bb22e1b6c85ac887acbc43f1f19c455d8b6ede6ed0ea53dcad9 Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.661860 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.684617 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.696023 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.701312 4797 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.709717 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.720896 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.741681 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.758348 4797 request.go:700] Waited for 1.902617779s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.760129 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.780069 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.821056 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.837450 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6f8\" (UniqueName: \"kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8\") pod \"oauth-openshift-558db77b4-kpfg6\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.842839 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3abe72d-d1b6-4d28-aa98-b788617fefa6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zdgmq\" (UID: \"c3abe72d-d1b6-4d28-aa98-b788617fefa6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.878376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ba2242-1b56-498e-9b09-24429a43d24e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zd772\" (UID: \"c4ba2242-1b56-498e-9b09-24429a43d24e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.886244 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbjl\" (UniqueName: \"kubernetes.io/projected/a1531e53-84bf-4286-8341-10b43cafdeb6-kube-api-access-2lbjl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6kblt\" (UID: \"a1531e53-84bf-4286-8341-10b43cafdeb6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.899284 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vvtcr\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-kube-api-access-vvtcr\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.914307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.939002 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgfh\" (UniqueName: \"kubernetes.io/projected/061a6e3e-616c-440a-9db7-3332d1c11361-kube-api-access-ktgfh\") pod \"etcd-operator-b45778765-4t9nl\" (UID: \"061a6e3e-616c-440a-9db7-3332d1c11361\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.957276 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df7jp\" (UniqueName: \"kubernetes.io/projected/71c458bf-7891-460b-957c-73a3f4aa15f6-kube-api-access-df7jp\") pod \"service-ca-operator-777779d784-8nsxk\" (UID: \"71c458bf-7891-460b-957c-73a3f4aa15f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.976992 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.981327 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp6n8\" (UniqueName: \"kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8\") pod \"controller-manager-879f6c89f-jgd8l\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.983094 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" Sep 30 17:44:56 crc kubenswrapper[4797]: I0930 17:44:56.984395 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tgkt"] Sep 30 17:44:57 crc kubenswrapper[4797]: W0930 17:44:57.002847 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd6a13f_ad8b_4c25_828a_7ae6ae240ea3.slice/crio-2c49672cd769fe75097b33512c5325277a35da1114333f3936cdf428e391d346 WatchSource:0}: Error finding container 2c49672cd769fe75097b33512c5325277a35da1114333f3936cdf428e391d346: Status 404 returned error can't find the container with id 2c49672cd769fe75097b33512c5325277a35da1114333f3936cdf428e391d346 Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.003415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npl67\" (UniqueName: \"kubernetes.io/projected/801fce17-d3ba-4444-888a-872baf5698ca-kube-api-access-npl67\") pod \"openshift-apiserver-operator-796bbdcf4f-fbnc4\" (UID: \"801fce17-d3ba-4444-888a-872baf5698ca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.005115 4797 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8p4lg"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.019086 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzjgg\" (UniqueName: \"kubernetes.io/projected/4255c7fa-ba59-4363-ad05-cb5e7a8628e4-kube-api-access-bzjgg\") pod \"downloads-7954f5f757-dhl6s\" (UID: \"4255c7fa-ba59-4363-ad05-cb5e7a8628e4\") " pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.033456 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.038024 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.038421 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd679\" (UniqueName: \"kubernetes.io/projected/0be683a6-13c1-4895-a59e-b5a7337633b4-kube-api-access-nd679\") pod \"openshift-config-operator-7777fb866f-7n6zc\" (UID: \"0be683a6-13c1-4895-a59e-b5a7337633b4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.053294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c031e-14c5-4a4a-8bed-1e9729838102-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rmh7z\" (UID: \"a97c031e-14c5-4a4a-8bed-1e9729838102\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.059174 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" Sep 30 17:44:57 crc kubenswrapper[4797]: W0930 17:44:57.061766 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0adfd2b_83f6_41ea_beee_cd0a5ac3973b.slice/crio-e6eb986a6dc655b36737330818d9c009763672199c870c3a1268d3c2508de21d WatchSource:0}: Error finding container e6eb986a6dc655b36737330818d9c009763672199c870c3a1268d3c2508de21d: Status 404 returned error can't find the container with id e6eb986a6dc655b36737330818d9c009763672199c870c3a1268d3c2508de21d Sep 30 17:44:57 crc kubenswrapper[4797]: W0930 17:44:57.062544 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1ebf89_17d5_4811_9ee9_05d7e50441e9.slice/crio-eead655a5922a5adab995cd3006e8a42bc185cea8d09bbdf1797e8c1a0acce6f WatchSource:0}: Error finding container eead655a5922a5adab995cd3006e8a42bc185cea8d09bbdf1797e8c1a0acce6f: Status 404 returned error can't find the container with id eead655a5922a5adab995cd3006e8a42bc185cea8d09bbdf1797e8c1a0acce6f Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.066471 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.072821 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.074895 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c74b\" (UniqueName: \"kubernetes.io/projected/70026aa5-47c8-4f0b-a942-ca0fa33436bc-kube-api-access-9c74b\") pod \"dns-operator-744455d44c-9dt8d\" (UID: \"70026aa5-47c8-4f0b-a942-ca0fa33436bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.076161 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sx5xp"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.083964 4797 generic.go:334] "Generic (PLEG): container finished" podID="af6cfc48-04fd-45f2-b891-4a9d6a484e26" containerID="2f28b625cbe0aac285bf4c357842c282778175e9ae0a2df5a739eaf5eaaec995" exitCode=0 Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.084008 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" event={"ID":"af6cfc48-04fd-45f2-b891-4a9d6a484e26","Type":"ContainerDied","Data":"2f28b625cbe0aac285bf4c357842c282778175e9ae0a2df5a739eaf5eaaec995"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.084030 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" event={"ID":"af6cfc48-04fd-45f2-b891-4a9d6a484e26","Type":"ContainerStarted","Data":"4178538a78495bb22e1b6c85ac887acbc43f1f19c455d8b6ede6ed0ea53dcad9"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.089651 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" event={"ID":"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3","Type":"ContainerStarted","Data":"2c49672cd769fe75097b33512c5325277a35da1114333f3936cdf428e391d346"} Sep 30 17:44:57 crc 
kubenswrapper[4797]: I0930 17:44:57.090471 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" event={"ID":"8d1ebf89-17d5-4811-9ee9-05d7e50441e9","Type":"ContainerStarted","Data":"eead655a5922a5adab995cd3006e8a42bc185cea8d09bbdf1797e8c1a0acce6f"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.092765 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" event={"ID":"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b","Type":"ContainerStarted","Data":"e6eb986a6dc655b36737330818d9c009763672199c870c3a1268d3c2508de21d"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.096611 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvvp2\" (UniqueName: \"kubernetes.io/projected/bd80c06b-a7d1-4c11-842c-4dd4c5000b31-kube-api-access-pvvp2\") pod \"multus-admission-controller-857f4d67dd-6ntb2\" (UID: \"bd80c06b-a7d1-4c11-842c-4dd4c5000b31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.098613 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" event={"ID":"95ff3b88-c7c3-4e2f-abc3-8026bb941a37","Type":"ContainerStarted","Data":"d763f266baec66a83b00313f31ba5643b5511f9b39ab2141d5b90525e185935c"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.098650 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" event={"ID":"95ff3b88-c7c3-4e2f-abc3-8026bb941a37","Type":"ContainerStarted","Data":"b671b1de8b6d434b0febe4999d9601fb7a920187122b45368d8272925f1a5f25"} Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.117243 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v567k\" (UniqueName: 
\"kubernetes.io/projected/2859bf41-057a-48e9-b94f-3be799ab24ee-kube-api-access-v567k\") pod \"console-operator-58897d9998-gh7d7\" (UID: \"2859bf41-057a-48e9-b94f-3be799ab24ee\") " pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.130673 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.131673 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.136737 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.139095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx826\" (UniqueName: \"kubernetes.io/projected/71d1aee4-a761-40b4-b965-de0c111c4f2a-kube-api-access-mx826\") pod \"migrator-59844c95c7-l2fwf\" (UID: \"71d1aee4-a761-40b4-b965-de0c111c4f2a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.140864 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.160455 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lgfr\" (UniqueName: \"kubernetes.io/projected/9e446878-fbd1-4260-9fe9-f04f6f13172d-kube-api-access-2lgfr\") pod \"ingress-operator-5b745b69d9-rk7f6\" (UID: \"9e446878-fbd1-4260-9fe9-f04f6f13172d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.186133 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4370e92c-7565-4172-b6f5-4a805a338231-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s5vh7\" (UID: \"4370e92c-7565-4172-b6f5-4a805a338231\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.204603 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbzr\" (UniqueName: \"kubernetes.io/projected/02649832-369a-4e11-bf3d-ad7d522629b1-kube-api-access-xxbzr\") pod \"machine-config-operator-74547568cd-frbm9\" (UID: \"02649832-369a-4e11-bf3d-ad7d522629b1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.220944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4lm\" (UniqueName: \"kubernetes.io/projected/1595bfb1-9c13-4148-a2ae-075a0fb0e05b-kube-api-access-hg4lm\") pod \"control-plane-machine-set-operator-78cbb6b69f-ndhrr\" (UID: \"1595bfb1-9c13-4148-a2ae-075a0fb0e05b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245401 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-default-certificate\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245507 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245554 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2vxl\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245593 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245619 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: 
\"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245669 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.246510 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:57.746494056 +0000 UTC m=+148.268993294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.245747 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248013 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248045 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hslk\" (UniqueName: \"kubernetes.io/projected/cf3e5bac-0992-4ba7-9899-44d44f898977-kube-api-access-8hslk\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248066 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248081 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-stats-auth\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248105 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-metrics-certs\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248185 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88sg\" (UniqueName: \"kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 
17:44:57.248204 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-srv-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248242 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwgl\" (UniqueName: \"kubernetes.io/projected/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-kube-api-access-qzwgl\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248264 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.248293 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf3e5bac-0992-4ba7-9899-44d44f898977-service-ca-bundle\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.255063 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.257070 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.264353 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.270489 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:57 crc kubenswrapper[4797]: W0930 17:44:57.286481 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58496a63_6105_4e38_b1b0_f91a7276121e.slice/crio-0e732097a43e1d93ccbd71d6b62b602854262d959f2de5927dec61cce7ca8aac WatchSource:0}: Error finding container 0e732097a43e1d93ccbd71d6b62b602854262d959f2de5927dec61cce7ca8aac: Status 404 returned error can't find the container with id 0e732097a43e1d93ccbd71d6b62b602854262d959f2de5927dec61cce7ca8aac Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.290782 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.296130 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.303059 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.317943 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.326293 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.332190 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4t9nl"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.340235 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351701 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351864 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqmgn\" (UniqueName: \"kubernetes.io/projected/e6ccf165-5613-42ca-bcc3-6c44c9db369a-kube-api-access-zqmgn\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351901 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hslk\" (UniqueName: \"kubernetes.io/projected/cf3e5bac-0992-4ba7-9899-44d44f898977-kube-api-access-8hslk\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351919 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351934 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-stats-auth\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351950 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2pv\" (UniqueName: \"kubernetes.io/projected/8b5e8fb3-229c-474b-86e8-d52c08555c5e-kube-api-access-7b2pv\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.351981 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b586a7c7-c606-496d-8838-9cdf9790d4d0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352000 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpphp\" (UniqueName: \"kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 
17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-metrics-certs\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-apiservice-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352209 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-webhook-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352254 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86hxv\" (UniqueName: \"kubernetes.io/projected/b789d400-5446-4d1e-9b8a-c563b150668d-kube-api-access-86hxv\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-profile-collector-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352322 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352366 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b5e8fb3-229c-474b-86e8-d52c08555c5e-metrics-tls\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352388 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b5e8fb3-229c-474b-86e8-d52c08555c5e-config-volume\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88sg\" (UniqueName: \"kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352424 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-srv-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-registration-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352517 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a534e74-44b0-4953-92bd-97a47ea3e2be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352620 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwgl\" (UniqueName: \"kubernetes.io/projected/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-kube-api-access-qzwgl\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352657 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" 
Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352675 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bvh\" (UniqueName: \"kubernetes.io/projected/0cea082a-65b0-457d-8418-0c6247118e92-kube-api-access-54bvh\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352715 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be38c1f4-08a9-4beb-8733-ef379fad6cf8-cert\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0cea082a-65b0-457d-8418-0c6247118e92-signing-cabundle\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352804 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf3e5bac-0992-4ba7-9899-44d44f898977-service-ca-bundle\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352849 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-default-certificate\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " 
pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352866 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jnmm\" (UniqueName: \"kubernetes.io/projected/b65fb47f-d3bc-40dc-8352-27edf856762c-kube-api-access-8jnmm\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352892 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352907 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-socket-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352949 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2vxl\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352965 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0cea082a-65b0-457d-8418-0c6247118e92-signing-key\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.352988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353002 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e6ccf165-5613-42ca-bcc3-6c44c9db369a-tmpfs\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353017 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-srv-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353059 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353145 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlt25\" (UniqueName: \"kubernetes.io/projected/5a534e74-44b0-4953-92bd-97a47ea3e2be-kube-api-access-dlt25\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353180 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a534e74-44b0-4953-92bd-97a47ea3e2be-proxy-tls\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353203 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-csi-data-dir\") pod 
\"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353297 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353313 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hn6b\" (UniqueName: \"kubernetes.io/projected/be38c1f4-08a9-4beb-8733-ef379fad6cf8-kube-api-access-2hn6b\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.353352 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:57.853330031 +0000 UTC m=+148.375829269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353407 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-node-bootstrap-token\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353466 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d8c\" (UniqueName: \"kubernetes.io/projected/e8e273fa-fe2e-4966-9b77-2636563e3326-kube-api-access-x7d8c\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353532 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-plugins-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.353947 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.355101 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358219 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-certs\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358283 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-mountpoint-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358329 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pprmn\" (UniqueName: \"kubernetes.io/projected/b586a7c7-c606-496d-8838-9cdf9790d4d0-kube-api-access-pprmn\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358363 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358382 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.358505 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.359164 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf3e5bac-0992-4ba7-9899-44d44f898977-service-ca-bundle\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.360815 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.361951 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.363940 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.367616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.367667 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-stats-auth\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.369511 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-default-certificate\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 
17:44:57.369873 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.373043 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf3e5bac-0992-4ba7-9899-44d44f898977-metrics-certs\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.377540 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-srv-cert\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.381738 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.385459 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.407048 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.408456 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hslk\" (UniqueName: \"kubernetes.io/projected/cf3e5bac-0992-4ba7-9899-44d44f898977-kube-api-access-8hslk\") pod \"router-default-5444994796-jzbcv\" (UID: \"cf3e5bac-0992-4ba7-9899-44d44f898977\") " pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.419238 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88sg\" (UniqueName: \"kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg\") pod \"marketplace-operator-79b997595-dhrrh\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.442213 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.459751 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwgl\" (UniqueName: \"kubernetes.io/projected/d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530-kube-api-access-qzwgl\") pod \"olm-operator-6b444d44fb-x4crt\" (UID: \"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461464 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-csi-data-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461500 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hn6b\" (UniqueName: \"kubernetes.io/projected/be38c1f4-08a9-4beb-8733-ef379fad6cf8-kube-api-access-2hn6b\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7d8c\" (UniqueName: \"kubernetes.io/projected/e8e273fa-fe2e-4966-9b77-2636563e3326-kube-api-access-x7d8c\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-plugins-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: 
I0930 17:44:57.461553 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-node-bootstrap-token\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-certs\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461593 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-mountpoint-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461609 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pprmn\" (UniqueName: \"kubernetes.io/projected/b586a7c7-c606-496d-8838-9cdf9790d4d0-kube-api-access-pprmn\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461630 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqmgn\" (UniqueName: \"kubernetes.io/projected/e6ccf165-5613-42ca-bcc3-6c44c9db369a-kube-api-access-zqmgn\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461647 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b586a7c7-c606-496d-8838-9cdf9790d4d0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461663 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpphp\" (UniqueName: \"kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461678 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2pv\" (UniqueName: \"kubernetes.io/projected/8b5e8fb3-229c-474b-86e8-d52c08555c5e-kube-api-access-7b2pv\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461704 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-apiservice-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461719 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-webhook-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461738 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86hxv\" (UniqueName: \"kubernetes.io/projected/b789d400-5446-4d1e-9b8a-c563b150668d-kube-api-access-86hxv\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461754 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-profile-collector-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b5e8fb3-229c-474b-86e8-d52c08555c5e-metrics-tls\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461793 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b5e8fb3-229c-474b-86e8-d52c08555c5e-config-volume\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461811 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-registration-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461843 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a534e74-44b0-4953-92bd-97a47ea3e2be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461867 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bvh\" (UniqueName: \"kubernetes.io/projected/0cea082a-65b0-457d-8418-0c6247118e92-kube-api-access-54bvh\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461884 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be38c1f4-08a9-4beb-8733-ef379fad6cf8-cert\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461898 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0cea082a-65b0-457d-8418-0c6247118e92-signing-cabundle\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461922 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8jnmm\" (UniqueName: \"kubernetes.io/projected/b65fb47f-d3bc-40dc-8352-27edf856762c-kube-api-access-8jnmm\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-socket-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461947 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-csi-data-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461960 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0cea082a-65b0-457d-8418-0c6247118e92-signing-key\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461978 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.461993 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e6ccf165-5613-42ca-bcc3-6c44c9db369a-tmpfs\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462013 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462028 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-srv-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462061 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a534e74-44b0-4953-92bd-97a47ea3e2be-proxy-tls\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462076 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlt25\" (UniqueName: \"kubernetes.io/projected/5a534e74-44b0-4953-92bd-97a47ea3e2be-kube-api-access-dlt25\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.462856 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.463363 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-socket-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.463638 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-plugins-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.463907 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:57.963896203 +0000 UTC m=+148.486395431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.464600 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-mountpoint-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.464946 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a534e74-44b0-4953-92bd-97a47ea3e2be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.465095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b65fb47f-d3bc-40dc-8352-27edf856762c-registration-dir\") pod \"csi-hostpathplugin-xc7x2\" (UID: \"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.466106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e6ccf165-5613-42ca-bcc3-6c44c9db369a-tmpfs\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.466948 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b5e8fb3-229c-474b-86e8-d52c08555c5e-config-volume\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.468285 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0cea082a-65b0-457d-8418-0c6247118e92-signing-cabundle\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.474640 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0cea082a-65b0-457d-8418-0c6247118e92-signing-key\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.476026 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.476211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b5e8fb3-229c-474b-86e8-d52c08555c5e-metrics-tls\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc 
kubenswrapper[4797]: I0930 17:44:57.476218 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.476716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-profile-collector-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.476724 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be38c1f4-08a9-4beb-8733-ef379fad6cf8-cert\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.476841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-certs\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.477152 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b789d400-5446-4d1e-9b8a-c563b150668d-srv-cert\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc 
kubenswrapper[4797]: I0930 17:44:57.477201 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e8e273fa-fe2e-4966-9b77-2636563e3326-node-bootstrap-token\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.477367 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-apiservice-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.486578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b586a7c7-c606-496d-8838-9cdf9790d4d0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.487867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6ccf165-5613-42ca-bcc3-6c44c9db369a-webhook-cert\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.488782 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a534e74-44b0-4953-92bd-97a47ea3e2be-proxy-tls\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: 
\"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.502139 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2vxl\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.525132 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d8c\" (UniqueName: \"kubernetes.io/projected/e8e273fa-fe2e-4966-9b77-2636563e3326-kube-api-access-x7d8c\") pod \"machine-config-server-dbsss\" (UID: \"e8e273fa-fe2e-4966-9b77-2636563e3326\") " pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.537307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hn6b\" (UniqueName: \"kubernetes.io/projected/be38c1f4-08a9-4beb-8733-ef379fad6cf8-kube-api-access-2hn6b\") pod \"ingress-canary-mzs2t\" (UID: \"be38c1f4-08a9-4beb-8733-ef379fad6cf8\") " pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.562646 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.563277 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.063261038 +0000 UTC m=+148.585760276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.575985 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqmgn\" (UniqueName: \"kubernetes.io/projected/e6ccf165-5613-42ca-bcc3-6c44c9db369a-kube-api-access-zqmgn\") pod \"packageserver-d55dfcdfc-764b6\" (UID: \"e6ccf165-5613-42ca-bcc3-6c44c9db369a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.584387 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2pv\" (UniqueName: \"kubernetes.io/projected/8b5e8fb3-229c-474b-86e8-d52c08555c5e-kube-api-access-7b2pv\") pod \"dns-default-hvlqn\" (UID: \"8b5e8fb3-229c-474b-86e8-d52c08555c5e\") " pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.608451 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.619175 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jnmm\" (UniqueName: \"kubernetes.io/projected/b65fb47f-d3bc-40dc-8352-27edf856762c-kube-api-access-8jnmm\") pod \"csi-hostpathplugin-xc7x2\" (UID: 
\"b65fb47f-d3bc-40dc-8352-27edf856762c\") " pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.620729 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.630876 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pprmn\" (UniqueName: \"kubernetes.io/projected/b586a7c7-c606-496d-8838-9cdf9790d4d0-kube-api-access-pprmn\") pod \"package-server-manager-789f6589d5-5rg4c\" (UID: \"b586a7c7-c606-496d-8838-9cdf9790d4d0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.645637 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.662100 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpphp\" (UniqueName: \"kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp\") pod \"collect-profiles-29320890-q8f6x\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.663455 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bvh\" (UniqueName: \"kubernetes.io/projected/0cea082a-65b0-457d-8418-0c6247118e92-kube-api-access-54bvh\") pod \"service-ca-9c57cc56f-wgdps\" (UID: \"0cea082a-65b0-457d-8418-0c6247118e92\") " pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.664132 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.664418 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.16440557 +0000 UTC m=+148.686904808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.667284 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.677616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlt25\" (UniqueName: \"kubernetes.io/projected/5a534e74-44b0-4953-92bd-97a47ea3e2be-kube-api-access-dlt25\") pod \"machine-config-controller-84d6567774-bkdjn\" (UID: \"5a534e74-44b0-4953-92bd-97a47ea3e2be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.695729 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.698027 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86hxv\" (UniqueName: \"kubernetes.io/projected/b789d400-5446-4d1e-9b8a-c563b150668d-kube-api-access-86hxv\") pod \"catalog-operator-68c6474976-8pvns\" (UID: \"b789d400-5446-4d1e-9b8a-c563b150668d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.703240 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.709988 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.716343 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.728504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.733597 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.748147 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.758198 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.765663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.766027 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.266013443 +0000 UTC m=+148.788512681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.769228 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mzs2t" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.774221 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hvlqn" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.795371 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.798092 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dbsss" Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.799170 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc"] Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.867993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.868445 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.368415525 +0000 UTC m=+148.890914763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.880721 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dhl6s"] Sep 30 17:44:57 crc kubenswrapper[4797]: W0930 17:44:57.922387 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be683a6_13c1_4895_a59e_b5a7337633b4.slice/crio-0b0c42bbdb65c65ce7ad6cad6326cd9aabb74d2e5ebc370d0a97d256e1f8fec2 WatchSource:0}: Error finding container 0b0c42bbdb65c65ce7ad6cad6326cd9aabb74d2e5ebc370d0a97d256e1f8fec2: Status 404 returned error can't find the container with id 0b0c42bbdb65c65ce7ad6cad6326cd9aabb74d2e5ebc370d0a97d256e1f8fec2 Sep 30 17:44:57 crc kubenswrapper[4797]: I0930 17:44:57.969946 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:57 crc kubenswrapper[4797]: E0930 17:44:57.970411 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.470397328 +0000 UTC m=+148.992896566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.070039 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gh7d7"] Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.071222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.071675 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.571659732 +0000 UTC m=+149.094158970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.074775 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4"] Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.076365 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.077853 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dt8d"] Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.144291 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" event={"ID":"9e295c6d-6b86-4912-a202-31b22488bd57","Type":"ContainerStarted","Data":"39b9397cbffd8d4eedd347731542ebafce8d6bba527d1734a329a5585019fd7a"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.144335 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" event={"ID":"9e295c6d-6b86-4912-a202-31b22488bd57","Type":"ContainerStarted","Data":"10d4a8caa77ba632da570adacb892f9fa62f7e00ae48ee59da56332358285cdd"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.146354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" 
event={"ID":"8d1ebf89-17d5-4811-9ee9-05d7e50441e9","Type":"ContainerStarted","Data":"a8a009e20380cba9977a3ac9419b6a06930a6a3decc517dea6c298d42694ebe8"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.146588 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.153388 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngfnz" event={"ID":"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0","Type":"ContainerStarted","Data":"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.153452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngfnz" event={"ID":"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0","Type":"ContainerStarted","Data":"50419cb32ecb5f89c82a214c669506aa90c9bad7d159a6231dc58f24b6b939a6"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.156555 4797 generic.go:334] "Generic (PLEG): container finished" podID="ebfcdd10-b87b-4d46-9bea-3d1b98273a28" containerID="e7718294e9bdc59db1af330c88f17093f93d6984b96f39bf7eebe9eb2c3615c2" exitCode=0 Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.156650 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" event={"ID":"ebfcdd10-b87b-4d46-9bea-3d1b98273a28","Type":"ContainerDied","Data":"e7718294e9bdc59db1af330c88f17093f93d6984b96f39bf7eebe9eb2c3615c2"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.156670 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" event={"ID":"ebfcdd10-b87b-4d46-9bea-3d1b98273a28","Type":"ContainerStarted","Data":"23483d2c3e30a0f719221a1ed547518a70c8fbee6a11298b39c04101fb4603ad"} Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.156909 4797 patch_prober.go:28] 
interesting pod/route-controller-manager-6576b87f9c-2s4jq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.156963 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.163848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" event={"ID":"95ff3b88-c7c3-4e2f-abc3-8026bb941a37","Type":"ContainerStarted","Data":"612c67e513c0d3f1a1468200d4019ea5e98a382f7724fae89978bb8e78bea5f8"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.165154 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jzbcv" event={"ID":"cf3e5bac-0992-4ba7-9899-44d44f898977","Type":"ContainerStarted","Data":"5b64a397e4f700e8af8e2c6b49b940ef3cd79020a452333db39bbafc5ac7d739"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.166507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" event={"ID":"a1531e53-84bf-4286-8341-10b43cafdeb6","Type":"ContainerStarted","Data":"eb8b24dbeac32555e5da79510e36977cfe9243f168a05f3a9d91b96612325ac6"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.171262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dhl6s" event={"ID":"4255c7fa-ba59-4363-ad05-cb5e7a8628e4","Type":"ContainerStarted","Data":"fbad5b32ed0020975051d7a00dee61f644170e91a2335e638b952733ae51b315"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.172568 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.172710 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.67269103 +0000 UTC m=+149.195190268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.175282 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" event={"ID":"a736d1e6-e7c0-42ca-b7c4-8717df579416","Type":"ContainerStarted","Data":"67dbbb7b72a0768798d519724684321a43d0719acd737202a73d1bf564603b5a"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.175561 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" event={"ID":"a736d1e6-e7c0-42ca-b7c4-8717df579416","Type":"ContainerStarted","Data":"7de4e3a0f375d7fa98a3036f8f3e11d2bbd9563d7ba8c53fb48d875896022514"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.202981 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.203263 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.703247069 +0000 UTC m=+149.225746307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.206681 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ntb2"]
Sep 30 17:44:58 crc kubenswrapper[4797]: W0930 17:44:58.212212 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801fce17_d3ba_4444_888a_872baf5698ca.slice/crio-907ef9925e173ddb096c91351bceae4188a0799b66e5341dfe56bcda06274e84 WatchSource:0}: Error finding container 907ef9925e173ddb096c91351bceae4188a0799b66e5341dfe56bcda06274e84: Status 404 returned error can't find the container with id 907ef9925e173ddb096c91351bceae4188a0799b66e5341dfe56bcda06274e84
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.219003 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" event={"ID":"8cd6a13f-ad8b-4c25-828a-7ae6ae240ea3","Type":"ContainerStarted","Data":"a479e4eb53fc6d7ab7b3f32d5c285d18252061b8128a0ee36e1630ad35486de7"}
Sep 30 17:44:58 crc kubenswrapper[4797]: W0930 17:44:58.212656 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70026aa5_47c8_4f0b_a942_ca0fa33436bc.slice/crio-5276353b63fe0a31a8409529a8a58fa8ecf7e4ea00fbb529f136fc10db2f1ad5 WatchSource:0}: Error finding container 5276353b63fe0a31a8409529a8a58fa8ecf7e4ea00fbb529f136fc10db2f1ad5: Status 404 returned error can't find the container with id 5276353b63fe0a31a8409529a8a58fa8ecf7e4ea00fbb529f136fc10db2f1ad5
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.255646 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.255778 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.255789 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" event={"ID":"c4ba2242-1b56-498e-9b09-24429a43d24e","Type":"ContainerStarted","Data":"5b0471943d414db15933e849f4a08443da0fdfaff6c0e4f60dfcb2b1943015d0"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.260031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" event={"ID":"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b","Type":"ContainerStarted","Data":"3a0c7a0a4a0f0de9bb3c6ef593ad27c33b813bfec6e6553fee7ba444f291b5bb"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.260058 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" event={"ID":"c0adfd2b-83f6-41ea-beee-cd0a5ac3973b","Type":"ContainerStarted","Data":"b5badbbdf6a0aa98f8d3feedc9b547246d306145243d119c171d91b92eefd2e6"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.302484 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" event={"ID":"58496a63-6105-4e38-b1b0-f91a7276121e","Type":"ContainerStarted","Data":"21b43fad5edf41571d16f910352ea52e7a7202fa91b343d4b3bf2bc95d175bb8"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.302523 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" event={"ID":"58496a63-6105-4e38-b1b0-f91a7276121e","Type":"ContainerStarted","Data":"0e732097a43e1d93ccbd71d6b62b602854262d959f2de5927dec61cce7ca8aac"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.303377 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6"
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.303673 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.303971 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.803952829 +0000 UTC m=+149.326452067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.304792 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.305061 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.805052507 +0000 UTC m=+149.327551735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.322836 4797 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kpfg6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body=
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.323228 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused"
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.347205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" event={"ID":"71c458bf-7891-460b-957c-73a3f4aa15f6","Type":"ContainerStarted","Data":"8fc612ff0a9c294f06931b0dcb25149fa0b472fcf5b07345fbc37cdeb578c5fc"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.349866 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" event={"ID":"71c458bf-7891-460b-957c-73a3f4aa15f6","Type":"ContainerStarted","Data":"2ab27032681bfa5345fac51f98310d6a8c04e1aad80692f08b4137f33764c862"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.349903 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.353708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" event={"ID":"0be683a6-13c1-4895-a59e-b5a7337633b4","Type":"ContainerStarted","Data":"0b0c42bbdb65c65ce7ad6cad6326cd9aabb74d2e5ebc370d0a97d256e1f8fec2"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.359006 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.360285 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf"]
Sep 30 17:44:58 crc kubenswrapper[4797]: W0930 17:44:58.366221 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1595bfb1_9c13_4148_a2ae_075a0fb0e05b.slice/crio-4583256d3d09ce798503d98ceac2700dbf6ee57581eab6a538128dec176ef159 WatchSource:0}: Error finding container 4583256d3d09ce798503d98ceac2700dbf6ee57581eab6a538128dec176ef159: Status 404 returned error can't find the container with id 4583256d3d09ce798503d98ceac2700dbf6ee57581eab6a538128dec176ef159
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.366410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" event={"ID":"061a6e3e-616c-440a-9db7-3332d1c11361","Type":"ContainerStarted","Data":"48294f7c529c19ed0f400aaa3b587349c1975b8021cbd41b5c611aa6423ccd10"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.366459 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" event={"ID":"061a6e3e-616c-440a-9db7-3332d1c11361","Type":"ContainerStarted","Data":"367b3b31d2e614906a410cf159c95811a3cabc1f5a882a6e36cebe34e959875d"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.375197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" event={"ID":"af6cfc48-04fd-45f2-b891-4a9d6a484e26","Type":"ContainerStarted","Data":"2f5a81d605f85622caf1179e2dd896b375195281d7c27be3635bdda2997150c0"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.385041 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" event={"ID":"c3abe72d-d1b6-4d28-aa98-b788617fefa6","Type":"ContainerStarted","Data":"ce679df08fbd7e7e2bdd6eaf41b8a5b06b5526805050f59740588a59c65a9552"}
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.406652 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.413660 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:58.913641488 +0000 UTC m=+149.436140726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.515019 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.515655 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.01564419 +0000 UTC m=+149.538143428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.528081 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.546764 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.556208 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.576176 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.614212 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.616532 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.616898 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.116880624 +0000 UTC m=+149.639379862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.617130 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.617449 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.117423798 +0000 UTC m=+149.639923036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: W0930 17:44:58.619887 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0ea84ca_6302_4f7c_82c8_dfbe6f5d7530.slice/crio-95f4f1a7d46a6fc7ddb289f75303028053bc037962ed20d1dbba53f4967dacd6 WatchSource:0}: Error finding container 95f4f1a7d46a6fc7ddb289f75303028053bc037962ed20d1dbba53f4967dacd6: Status 404 returned error can't find the container with id 95f4f1a7d46a6fc7ddb289f75303028053bc037962ed20d1dbba53f4967dacd6
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.659486 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hvlqn"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.693475 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.703100 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.710974 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wgdps"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.717781 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.718117 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.218104557 +0000 UTC m=+149.740603795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.762275 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc7x2"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.768086 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mzs2t"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.776912 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"]
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.820148 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.820523 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.320509799 +0000 UTC m=+149.843009037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.843579 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wk4t6" podStartSLOduration=128.843561277 podStartE2EDuration="2m8.843561277s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:58.842202913 +0000 UTC m=+149.364702151" watchObservedRunningTime="2025-09-30 17:44:58.843561277 +0000 UTC m=+149.366060515"
Sep 30 17:44:58 crc kubenswrapper[4797]: I0930 17:44:58.920648 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:58 crc kubenswrapper[4797]: E0930 17:44:58.921037 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.421022674 +0000 UTC m=+149.943521902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.022254 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.022800 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.522789511 +0000 UTC m=+150.045288749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.048671 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" podStartSLOduration=129.048653341 podStartE2EDuration="2m9.048653341s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.001020946 +0000 UTC m=+149.523520184" watchObservedRunningTime="2025-09-30 17:44:59.048653341 +0000 UTC m=+149.571152579"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.115114 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6cgs6" podStartSLOduration=129.115092676 podStartE2EDuration="2m9.115092676s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.078187534 +0000 UTC m=+149.600686772" watchObservedRunningTime="2025-09-30 17:44:59.115092676 +0000 UTC m=+149.637591914"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.123136 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.123446 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.124624 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.624604369 +0000 UTC m=+150.147103607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.149730 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.167026 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ngfnz" podStartSLOduration=129.167011442 podStartE2EDuration="2m9.167011442s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.118231556 +0000 UTC m=+149.640730794" watchObservedRunningTime="2025-09-30 17:44:59.167011442 +0000 UTC m=+149.689510670"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.227111 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.227199 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.227239 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.227280 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.228905 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.72888206 +0000 UTC m=+150.251381288 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.243227 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.244484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.254864 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8nsxk" podStartSLOduration=128.254837832 podStartE2EDuration="2m8.254837832s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.245162195 +0000 UTC m=+149.767661443" watchObservedRunningTime="2025-09-30 17:44:59.254837832 +0000 UTC m=+149.777337070"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.272144 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.300006 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" podStartSLOduration=128.299978074 podStartE2EDuration="2m8.299978074s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.279682466 +0000 UTC m=+149.802181704" watchObservedRunningTime="2025-09-30 17:44:59.299978074 +0000 UTC m=+149.822477312"
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.333401 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.333750 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.833707804 +0000 UTC m=+150.356207042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.335826 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.336207 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.836198078 +0000 UTC m=+150.358697316 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.364153 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.391740 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.440315 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.444539 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:44:59.944512052 +0000 UTC m=+150.467011290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.455838 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.457187 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" podStartSLOduration=128.457145254 podStartE2EDuration="2m8.457145254s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.421200908 +0000 UTC m=+149.943700146" watchObservedRunningTime="2025-09-30 17:44:59.457145254 +0000 UTC m=+149.979644492" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.497729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" event={"ID":"1595bfb1-9c13-4148-a2ae-075a0fb0e05b","Type":"ContainerStarted","Data":"fb09d19d8425abf7e24f71504d5b4fbf7eb5e72d8b16943ec422bcbbaa60a079"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.497770 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" event={"ID":"1595bfb1-9c13-4148-a2ae-075a0fb0e05b","Type":"ContainerStarted","Data":"4583256d3d09ce798503d98ceac2700dbf6ee57581eab6a538128dec176ef159"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 
17:44:59.510652 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dbsss" event={"ID":"e8e273fa-fe2e-4966-9b77-2636563e3326","Type":"ContainerStarted","Data":"14c3745099995f89045192a7af40b30d7b6ddec87cd043022d886f5a54838166"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.510707 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dbsss" event={"ID":"e8e273fa-fe2e-4966-9b77-2636563e3326","Type":"ContainerStarted","Data":"0c986ef9f30b9907a7882ec02e086dc6e6cdab7275b44738a06110f32dad17da"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.515307 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" event={"ID":"71d1aee4-a761-40b4-b965-de0c111c4f2a","Type":"ContainerStarted","Data":"d46a4bdad49ca05c62cdbd9634d2b62fa1ba683d4237d123327dea4e220279c7"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.515374 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" event={"ID":"71d1aee4-a761-40b4-b965-de0c111c4f2a","Type":"ContainerStarted","Data":"2185b844aa2452450ac73dea854b820344b29ac2f9d9b57097970ee9d1dfb3b5"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.517225 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" event={"ID":"02649832-369a-4e11-bf3d-ad7d522629b1","Type":"ContainerStarted","Data":"8e37306c91c3fbbf23d2937ff0250fdd6d02e46fb4e0bf3428b25fa70d43689f"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.517262 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" 
event={"ID":"02649832-369a-4e11-bf3d-ad7d522629b1","Type":"ContainerStarted","Data":"3107111d01da076c376b0c4bc44cac5cb20e2736cd15736aa5f8127cb73b6971"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.519193 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" event={"ID":"c4ba2242-1b56-498e-9b09-24429a43d24e","Type":"ContainerStarted","Data":"3f9da4206f9ceba5d7d9e10c158e2f912b4e84021896cee6a7fb098ef3fa97cf"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.521002 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" event={"ID":"bd80c06b-a7d1-4c11-842c-4dd4c5000b31","Type":"ContainerStarted","Data":"a91c5fe4ed58bae8b657d85fd961d7dd5e3068c02d508edaab4bd66773b34118"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.521039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" event={"ID":"bd80c06b-a7d1-4c11-842c-4dd4c5000b31","Type":"ContainerStarted","Data":"048ffa4a1916962ff439ad0af944eb8c48e7d96df0fcfbfcaad858a1d9c286f3"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.525634 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" event={"ID":"70026aa5-47c8-4f0b-a942-ca0fa33436bc","Type":"ContainerStarted","Data":"69f5da9c4f9809f34b47aac07058fdfe874d1cf22b6aebad7054108540cea773"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.525672 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" event={"ID":"70026aa5-47c8-4f0b-a942-ca0fa33436bc","Type":"ContainerStarted","Data":"5276353b63fe0a31a8409529a8a58fa8ecf7e4ea00fbb529f136fc10db2f1ad5"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.530675 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" event={"ID":"a97c031e-14c5-4a4a-8bed-1e9729838102","Type":"ContainerStarted","Data":"7796698a6fe9d0622c0e70d6662f699df797ef6ffa2285f9b83089d20fa83d67"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.533216 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dhl6s" event={"ID":"4255c7fa-ba59-4363-ad05-cb5e7a8628e4","Type":"ContainerStarted","Data":"12ab5e32b425409fe8c8a756bd8719887daac51ddd99cf2fb3655e360becebc9"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.534276 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.536151 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8p4lg" podStartSLOduration=128.53613555 podStartE2EDuration="2m8.53613555s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.535444412 +0000 UTC m=+150.057943650" watchObservedRunningTime="2025-09-30 17:44:59.53613555 +0000 UTC m=+150.058634798" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.545422 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.545495 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: 
connect: connection refused" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.546302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.546738 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.04672298 +0000 UTC m=+150.569222218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.556683 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tgkt" podStartSLOduration=129.556665184 podStartE2EDuration="2m9.556665184s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.556141861 +0000 UTC m=+150.078641119" watchObservedRunningTime="2025-09-30 17:44:59.556665184 +0000 UTC m=+150.079164422" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.559642 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" event={"ID":"e6ccf165-5613-42ca-bcc3-6c44c9db369a","Type":"ContainerStarted","Data":"f32c1e8e5e42a1997440596f85208fe4272d81a6910e67ec2f90a741474db292"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.573644 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" event={"ID":"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8","Type":"ContainerStarted","Data":"79b897c8f650a1a29348d2514da9985a3ad38bd115abd9b3edb66e9135262a79"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.585786 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" event={"ID":"b65fb47f-d3bc-40dc-8352-27edf856762c","Type":"ContainerStarted","Data":"c2ea512d3872f2f2a63f30edb265a07dc448ecf702baf3c4fd1ba7439d706f32"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.605715 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4t9nl" podStartSLOduration=128.605698605 podStartE2EDuration="2m8.605698605s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.604390772 +0000 UTC m=+150.126890020" watchObservedRunningTime="2025-09-30 17:44:59.605698605 +0000 UTC m=+150.128197833" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.626299 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" event={"ID":"ebfcdd10-b87b-4d46-9bea-3d1b98273a28","Type":"ContainerStarted","Data":"656281cf4a1bfe5290ba254e7ac92c8ccf5a735acf647d86b4ff6338ac7d8350"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.634796 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zd772" podStartSLOduration=128.634779997 podStartE2EDuration="2m8.634779997s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.633981197 +0000 UTC m=+150.156480435" watchObservedRunningTime="2025-09-30 17:44:59.634779997 +0000 UTC m=+150.157279235" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.645288 4797 generic.go:334] "Generic (PLEG): container finished" podID="0be683a6-13c1-4895-a59e-b5a7337633b4" containerID="3f30de3b2d11cca979b29e849129bc915f2cd9fd2210309b941b7412238213be" exitCode=0 Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.645725 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" event={"ID":"0be683a6-13c1-4895-a59e-b5a7337633b4","Type":"ContainerDied","Data":"3f30de3b2d11cca979b29e849129bc915f2cd9fd2210309b941b7412238213be"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.647265 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.647359 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.147336947 +0000 UTC m=+150.669836195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.647510 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.649263 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.149246876 +0000 UTC m=+150.671746114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.658332 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" event={"ID":"b586a7c7-c606-496d-8838-9cdf9790d4d0","Type":"ContainerStarted","Data":"0c2ce64f4b72fa9d7fb01dd6cea90c015214abc26e369153232e3d88d59bea50"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.687361 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" event={"ID":"c3abe72d-d1b6-4d28-aa98-b788617fefa6","Type":"ContainerStarted","Data":"302a8098d04f1832d9f549c93e8e1d62a126e6d08e59750def95017b8e11a531"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.689236 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" event={"ID":"2859bf41-057a-48e9-b94f-3be799ab24ee","Type":"ContainerStarted","Data":"44530a0bad0c9ab15c2e5bbd517a95674f9fe073ab39c2d3b8c9160310032192"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.689259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" event={"ID":"2859bf41-057a-48e9-b94f-3be799ab24ee","Type":"ContainerStarted","Data":"47b96dc812fce67748aa29af332fe0f5f98d516ad5bbe1d45aee2218ea9a69bd"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.690537 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.692387 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mzs2t" event={"ID":"be38c1f4-08a9-4beb-8733-ef379fad6cf8","Type":"ContainerStarted","Data":"c74798d578725956bf29dc6d44eef449664f358f0b22651158c0f003c56600f3"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.694233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" event={"ID":"03a9880c-d077-47c8-b93a-d96cf7dced9c","Type":"ContainerStarted","Data":"6bfac2f32579b96057c8a528b7e7756b3cd278dc471dc9b6f16aea0ce1bde4b2"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.694265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" event={"ID":"03a9880c-d077-47c8-b93a-d96cf7dced9c","Type":"ContainerStarted","Data":"e3a5ea64eba9f2bdda7e35696c65717d62c0308d3fbf420cad387c7b03ae6362"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.694843 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.700485 4797 patch_prober.go:28] interesting pod/console-operator-58897d9998-gh7d7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.700549 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" podUID="2859bf41-057a-48e9-b94f-3be799ab24ee" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection 
refused" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.700676 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dhrrh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.700699 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.714149 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ndhrr" podStartSLOduration=128.714126152 podStartE2EDuration="2m8.714126152s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.700994377 +0000 UTC m=+150.223493615" watchObservedRunningTime="2025-09-30 17:44:59.714126152 +0000 UTC m=+150.236625390" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.748066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.748925 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.24890732 +0000 UTC m=+150.771406558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.749016 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dbsss" podStartSLOduration=5.748990952 podStartE2EDuration="5.748990952s" podCreationTimestamp="2025-09-30 17:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.731950807 +0000 UTC m=+150.254450045" watchObservedRunningTime="2025-09-30 17:44:59.748990952 +0000 UTC m=+150.271490190" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.757343 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" event={"ID":"9e295c6d-6b86-4912-a202-31b22488bd57","Type":"ContainerStarted","Data":"a7b02a958b26ac6fc3d9a91c7a87eae921d5fc36d00bf2a7373aaa092e3a8fc9"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.800365 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dhl6s" podStartSLOduration=129.800350122 podStartE2EDuration="2m9.800350122s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 17:44:59.755119038 +0000 UTC m=+150.277618286" watchObservedRunningTime="2025-09-30 17:44:59.800350122 +0000 UTC m=+150.322849360" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.830351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" event={"ID":"801fce17-d3ba-4444-888a-872baf5698ca","Type":"ContainerStarted","Data":"b5bd59a9335628b3414f20f7d73954a5f010aea2196ce0be6fc7c9f04a5f80c0"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.830388 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" event={"ID":"801fce17-d3ba-4444-888a-872baf5698ca","Type":"ContainerStarted","Data":"907ef9925e173ddb096c91351bceae4188a0799b66e5341dfe56bcda06274e84"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.849858 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.850902 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" podStartSLOduration=128.850886251 podStartE2EDuration="2m8.850886251s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.849170397 +0000 UTC m=+150.371669635" watchObservedRunningTime="2025-09-30 17:44:59.850886251 +0000 UTC m=+150.373385489" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.851873 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zdgmq" podStartSLOduration=128.851866946 podStartE2EDuration="2m8.851866946s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.800588728 +0000 UTC m=+150.323087966" watchObservedRunningTime="2025-09-30 17:44:59.851866946 +0000 UTC m=+150.374366184" Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.852738 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.352719058 +0000 UTC m=+150.875218296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.885423 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvlqn" event={"ID":"8b5e8fb3-229c-474b-86e8-d52c08555c5e","Type":"ContainerStarted","Data":"d706e4f9e9f8198eabbcdc11c5219c5476bf214f6af43e2ffca52dae148d1d22"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.905090 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" 
event={"ID":"0dd5e175-89ae-4e6e-9134-56d10a1974c5","Type":"ContainerStarted","Data":"52d2376499e3173decc31e80bf630b5e077ed09ca763b5ae35de6ec56008e5f8"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.905448 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" event={"ID":"0dd5e175-89ae-4e6e-9134-56d10a1974c5","Type":"ContainerStarted","Data":"ddba231749194a04d4e9dba3909e49ce5721e1cbb7c966d3d4d679358dfa23ba"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.905952 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.917410 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" podStartSLOduration=129.917384038 podStartE2EDuration="2m9.917384038s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.91274533 +0000 UTC m=+150.435244568" watchObservedRunningTime="2025-09-30 17:44:59.917384038 +0000 UTC m=+150.439883276" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.920661 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jzbcv" event={"ID":"cf3e5bac-0992-4ba7-9899-44d44f898977","Type":"ContainerStarted","Data":"a4e124248132ba39da30316fb3bcfa618d3ca13c9161de3a9b9e520ba778411e"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.938365 4797 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jgd8l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Sep 30 17:44:59 crc 
kubenswrapper[4797]: I0930 17:44:59.938413 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.953251 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:44:59 crc kubenswrapper[4797]: E0930 17:44:59.954159 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.454144266 +0000 UTC m=+150.976643504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.966709 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" event={"ID":"4370e92c-7565-4172-b6f5-4a805a338231","Type":"ContainerStarted","Data":"4e803bbdb0df1efca22caaed4aed0680e62b6ffa23d62fff0ac453e484cd7b7b"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.966743 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" event={"ID":"4370e92c-7565-4172-b6f5-4a805a338231","Type":"ContainerStarted","Data":"2d5931124bae869196682292c1225ae7efd65774c7357bb975f5390b8e0b1ec4"} Sep 30 17:44:59 crc kubenswrapper[4797]: I0930 17:44:59.996144 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fbnc4" podStartSLOduration=129.996128648 podStartE2EDuration="2m9.996128648s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.968070682 +0000 UTC m=+150.490569940" watchObservedRunningTime="2025-09-30 17:44:59.996128648 +0000 UTC m=+150.518627886" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.022515 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2qqd" 
podStartSLOduration=130.02249336 podStartE2EDuration="2m10.02249336s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:44:59.996598189 +0000 UTC m=+150.519097487" watchObservedRunningTime="2025-09-30 17:45:00.02249336 +0000 UTC m=+150.544992598" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.033526 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" podStartSLOduration=129.033504181 podStartE2EDuration="2m9.033504181s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.030536235 +0000 UTC m=+150.553035493" watchObservedRunningTime="2025-09-30 17:45:00.033504181 +0000 UTC m=+150.556003429" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.037600 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" event={"ID":"a1531e53-84bf-4286-8341-10b43cafdeb6","Type":"ContainerStarted","Data":"6221a174f7ae15180eeb0452a4548291b975d388b2ae400bcd0fc86aee06ddbb"} Sep 30 17:45:00 crc kubenswrapper[4797]: W0930 17:45:00.045460 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fe5fadbaae5cb631c718a62997cf73cfce15f35a31fea3ea79322028f5e71039 WatchSource:0}: Error finding container fe5fadbaae5cb631c718a62997cf73cfce15f35a31fea3ea79322028f5e71039: Status 404 returned error can't find the container with id fe5fadbaae5cb631c718a62997cf73cfce15f35a31fea3ea79322028f5e71039 Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.048930 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" event={"ID":"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530","Type":"ContainerStarted","Data":"e4f5b3ee84fbdfd7cc551e5d66992b785ca930b4ba152bd2b0c1ead8362cda53"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.049059 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.049073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" event={"ID":"d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530","Type":"ContainerStarted","Data":"95f4f1a7d46a6fc7ddb289f75303028053bc037962ed20d1dbba53f4967dacd6"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.054548 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" event={"ID":"b789d400-5446-4d1e-9b8a-c563b150668d","Type":"ContainerStarted","Data":"44a988d0230ebe0dffacd1808a144290c9acac45e238dbbdefacb0ebc4d04def"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.055009 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.055762 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.555745599 +0000 UTC m=+151.078244837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.056821 4797 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8pvns container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.056863 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" podUID="b789d400-5446-4d1e-9b8a-c563b150668d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.055377 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.060261 4797 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-x4crt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.060307 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" 
podUID="d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.077284 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s5vh7" podStartSLOduration=129.077268988 podStartE2EDuration="2m9.077268988s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.075445601 +0000 UTC m=+150.597944839" watchObservedRunningTime="2025-09-30 17:45:00.077268988 +0000 UTC m=+150.599768226" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.099925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" event={"ID":"5a534e74-44b0-4953-92bd-97a47ea3e2be","Type":"ContainerStarted","Data":"cddd348152d5b16d57ec9fbc48b96acb413c5aee8537ebd66a7f613c14d5e2f8"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.132356 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" event={"ID":"0cea082a-65b0-457d-8418-0c6247118e92","Type":"ContainerStarted","Data":"f723830b8fb35bdb09e9668b3c9cd335117c9a1f7014fcd63e60dc24fe242d05"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.134847 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jzbcv" podStartSLOduration=129.134834926 podStartE2EDuration="2m9.134834926s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.133254247 +0000 UTC m=+150.655753485" 
watchObservedRunningTime="2025-09-30 17:45:00.134834926 +0000 UTC m=+150.657334154" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.145405 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" event={"ID":"9e446878-fbd1-4260-9fe9-f04f6f13172d","Type":"ContainerStarted","Data":"dec3f03d09eaa6e7748f907df9b53cc0b2392db396eb4cd4d2e6d1ca58aa19ca"} Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.145796 4797 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kpfg6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.145829 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.157556 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.158050 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.158794 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"] Sep 30 17:45:00 crc 
kubenswrapper[4797]: E0930 17:45:00.159239 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.659216429 +0000 UTC m=+151.181715667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.160104 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" podStartSLOduration=129.160094561 podStartE2EDuration="2m9.160094561s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.151930383 +0000 UTC m=+150.674429651" watchObservedRunningTime="2025-09-30 17:45:00.160094561 +0000 UTC m=+150.682593799" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.199586 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr"] Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.200219 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.200637 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr"] Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.203535 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" podStartSLOduration=129.20352505 podStartE2EDuration="2m9.20352505s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.197139647 +0000 UTC m=+150.719638895" watchObservedRunningTime="2025-09-30 17:45:00.20352505 +0000 UTC m=+150.726024278" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.236258 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6kblt" podStartSLOduration=129.236239925 podStartE2EDuration="2m9.236239925s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.233543856 +0000 UTC m=+150.756043104" watchObservedRunningTime="2025-09-30 17:45:00.236239925 +0000 UTC m=+150.758739163" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.259822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc 
kubenswrapper[4797]: E0930 17:45:00.264616 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.764598238 +0000 UTC m=+151.287097586 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.279513 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" podStartSLOduration=129.279492968 podStartE2EDuration="2m9.279492968s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:00.269605645 +0000 UTC m=+150.792104903" watchObservedRunningTime="2025-09-30 17:45:00.279492968 +0000 UTC m=+150.801992206" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.360410 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.360670 4797 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.860649209 +0000 UTC m=+151.383148447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.360709 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.360818 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.360923 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.360944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q7r\" (UniqueName: \"kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.361123 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.861112061 +0000 UTC m=+151.383611299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.467253 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.467442 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume\") pod 
\"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.467522 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q7r\" (UniqueName: \"kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.467579 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.467694 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:00.96767648 +0000 UTC m=+151.490175718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.469519 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.472574 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.495297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q7r\" (UniqueName: \"kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r\") pod \"collect-profiles-29320905-9m9zr\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.548707 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.569646 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.570098 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.070082513 +0000 UTC m=+151.592581751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.672545 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.673604 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.173580334 +0000 UTC m=+151.696079572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.704990 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.731260 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:00 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:00 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:00 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.731326 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.788467 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: 
\"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.788992 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.288966848 +0000 UTC m=+151.811466086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.794964 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr"] Sep 30 17:45:00 crc kubenswrapper[4797]: W0930 17:45:00.865168 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542a58fe_9654_424c_90dc_6d073f486328.slice/crio-abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051 WatchSource:0}: Error finding container abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051: Status 404 returned error can't find the container with id abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051 Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.893390 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.893817 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.393799033 +0000 UTC m=+151.916298271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:00 crc kubenswrapper[4797]: I0930 17:45:00.995353 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:00 crc kubenswrapper[4797]: E0930 17:45:00.995687 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.495676443 +0000 UTC m=+152.018175681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.096238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.096495 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.596399413 +0000 UTC m=+152.118898641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.096973 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.097267 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.597257545 +0000 UTC m=+152.119756783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.151624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mzs2t" event={"ID":"be38c1f4-08a9-4beb-8733-ef379fad6cf8","Type":"ContainerStarted","Data":"f6c84559a10d090a9cf259fb2eb13063feb3ef28fcbba11be39d75ac28840a2c"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.153747 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" event={"ID":"5a534e74-44b0-4953-92bd-97a47ea3e2be","Type":"ContainerStarted","Data":"a922ba2fae2e372ee6eb09a976e59ab693c5d7e00ab005bdcffab1973412c9b8"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.153788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" event={"ID":"5a534e74-44b0-4953-92bd-97a47ea3e2be","Type":"ContainerStarted","Data":"63c7b24093373a39f6766cae5ec9dd58c3568f9db730f21a1899a4ccd86589a5"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.155208 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e738604c28f909530e16c8dc74667f30bf15ca9b2d79fb58e66ce3a57f4e842d"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.155254 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d93b562e53de853ed20bb3a5d94568f8d1477fcd593754fcf55ef8c537fb0d12"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.157037 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" event={"ID":"70026aa5-47c8-4f0b-a942-ca0fa33436bc","Type":"ContainerStarted","Data":"5859d6ffe83dbba85760148ed1cafe446f1242dac0cbd6a855c8d546c3f6fbcb"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.158628 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dfabd894ee0d19f94b537560398906de068b42f8f3a5170023033140e850e3c0"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.158682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fe5fadbaae5cb631c718a62997cf73cfce15f35a31fea3ea79322028f5e71039"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.161030 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" event={"ID":"ebfcdd10-b87b-4d46-9bea-3d1b98273a28","Type":"ContainerStarted","Data":"e5c8bf865dacd5de9395fd49e17f6e0d6a2c86216f5feb8060cfe8e2c3424a77"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.166288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" event={"ID":"0be683a6-13c1-4895-a59e-b5a7337633b4","Type":"ContainerStarted","Data":"bb531a5bddbeddb65c55f7f4aa91c4871772f3e3299c38794351ae67fed85fd9"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.168624 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3bb921e1e3d46b13aa6e019095f8960fc901e5e56be19fc891b2653676a5e22e"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.168677 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d1181e92d21e21e2fb710204243636eec8c28d0b0bfee13641f21441de6c98f2"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.169065 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mzs2t" podStartSLOduration=7.169049797 podStartE2EDuration="7.169049797s" podCreationTimestamp="2025-09-30 17:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.166460261 +0000 UTC m=+151.688959499" watchObservedRunningTime="2025-09-30 17:45:01.169049797 +0000 UTC m=+151.691549035" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.174051 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" podUID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" containerName="collect-profiles" containerID="cri-o://222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce" gracePeriod=30 Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.174372 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" event={"ID":"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8","Type":"ContainerStarted","Data":"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.181089 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" event={"ID":"71d1aee4-a761-40b4-b965-de0c111c4f2a","Type":"ContainerStarted","Data":"2f7816dafe514b0b806aad534882653f22698ff9928dd1c67328324598a6d23f"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.183454 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" event={"ID":"02649832-369a-4e11-bf3d-ad7d522629b1","Type":"ContainerStarted","Data":"deb1c2d4ab30bdfe8894556093331412ad118cf6b9c8fd7ba76bb0479bfb060c"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.189580 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" event={"ID":"542a58fe-9654-424c-90dc-6d073f486328","Type":"ContainerStarted","Data":"abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.193062 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" podStartSLOduration=131.193049299 podStartE2EDuration="2m11.193049299s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.192015703 +0000 UTC m=+151.714514941" watchObservedRunningTime="2025-09-30 17:45:01.193049299 +0000 UTC m=+151.715548527" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.194201 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" event={"ID":"b789d400-5446-4d1e-9b8a-c563b150668d","Type":"ContainerStarted","Data":"8114730afcfacc8134f61cc0d51cf8d834c27be01ad2fc567d1ebef363d429fd"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.194806 4797 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-8pvns container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.194846 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" podUID="b789d400-5446-4d1e-9b8a-c563b150668d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.196215 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" event={"ID":"bd80c06b-a7d1-4c11-842c-4dd4c5000b31","Type":"ContainerStarted","Data":"b19cd39d8e6fd81c25c7ed03b4bbef5f0014945c1511f97c585a0996d68c55f2"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.197530 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.197855 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.697837641 +0000 UTC m=+152.220336879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.198058 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvlqn" event={"ID":"8b5e8fb3-229c-474b-86e8-d52c08555c5e","Type":"ContainerStarted","Data":"9105275daff36aec1e9caad9b5275469ce68c246740d0e526a17b47975e4211b"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.204966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" event={"ID":"b586a7c7-c606-496d-8838-9cdf9790d4d0","Type":"ContainerStarted","Data":"c60917d8cc831819b01a22c13b216d4749012909fcd7055e58b23345d926bcb4"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.205028 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" event={"ID":"b586a7c7-c606-496d-8838-9cdf9790d4d0","Type":"ContainerStarted","Data":"28799fcee345135efd3ce9b35982eba8cf950f35673e6e23127719673ec040a8"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.211320 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" event={"ID":"e6ccf165-5613-42ca-bcc3-6c44c9db369a","Type":"ContainerStarted","Data":"194d15f51dae3268a1fd9e892100a7980e243ff42935a37524fd659d0f97ac2d"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.211510 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.212793 4797 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-764b6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.212840 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" podUID="e6ccf165-5613-42ca-bcc3-6c44c9db369a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.214947 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wgdps" event={"ID":"0cea082a-65b0-457d-8418-0c6247118e92","Type":"ContainerStarted","Data":"c9266f74336f97fc539310610f3bbd4bd1340683f5c30b697861753cc331a80c"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.225266 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" event={"ID":"9e446878-fbd1-4260-9fe9-f04f6f13172d","Type":"ContainerStarted","Data":"2a926e74b5119f0c5709fcee44001af02685fe15575d98c54942578edef6e7e6"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.225320 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" event={"ID":"9e446878-fbd1-4260-9fe9-f04f6f13172d","Type":"ContainerStarted","Data":"6918e48442977713117b69443fb9de83cb1340e0ff11068dab1d0d6028630420"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.227380 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" event={"ID":"a97c031e-14c5-4a4a-8bed-1e9729838102","Type":"ContainerStarted","Data":"d27ea993aabbf791fcd0b69ddee110e3060a79447aa80dfc7f2ef9d91c3f3642"} Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.227953 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.227957 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dhrrh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.228001 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.228055 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.228678 4797 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jgd8l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: 
connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.228711 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.229664 4797 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-x4crt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.229699 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" podUID="d0ea84ca-6302-4f7c-82c8-dfbe6f5d7530" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.230141 4797 patch_prober.go:28] interesting pod/console-operator-58897d9998-gh7d7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.230163 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" podUID="2859bf41-057a-48e9-b94f-3be799ab24ee" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 
17:45:01.250157 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-frbm9" podStartSLOduration=130.250138426 podStartE2EDuration="2m10.250138426s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.217684378 +0000 UTC m=+151.740183616" watchObservedRunningTime="2025-09-30 17:45:01.250138426 +0000 UTC m=+151.772637664" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.250735 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" podStartSLOduration=130.250728121 podStartE2EDuration="2m10.250728121s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.248126645 +0000 UTC m=+151.770625883" watchObservedRunningTime="2025-09-30 17:45:01.250728121 +0000 UTC m=+151.773227369" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.287821 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rmh7z" podStartSLOduration=130.287805667 podStartE2EDuration="2m10.287805667s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.283201379 +0000 UTC m=+151.805700607" watchObservedRunningTime="2025-09-30 17:45:01.287805667 +0000 UTC m=+151.810304905" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.299782 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.308747 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.80873021 +0000 UTC m=+152.331229448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.351525 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.351793 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.368620 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" podStartSLOduration=130.368593248 podStartE2EDuration="2m10.368593248s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:01.361815195 +0000 UTC m=+151.884314433" 
watchObservedRunningTime="2025-09-30 17:45:01.368593248 +0000 UTC m=+151.891092486" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.373302 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.400761 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.402946 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:01.902918284 +0000 UTC m=+152.425417522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.503505 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.503953 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.003936642 +0000 UTC m=+152.526435870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.529084 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.604679 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.604878 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.104854067 +0000 UTC m=+152.627353305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.605195 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.605538 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.105529764 +0000 UTC m=+152.628029002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.705878 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.706081 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.206053909 +0000 UTC m=+152.728553157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.706182 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.706545 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.206534351 +0000 UTC m=+152.729033599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.707332 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:45:01 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld
Sep 30 17:45:01 crc kubenswrapper[4797]: [+]process-running ok
Sep 30 17:45:01 crc kubenswrapper[4797]: healthz check failed
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.707376 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.806835 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.807014 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.306988414 +0000 UTC m=+152.829487642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.807141 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.807485 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.307477177 +0000 UTC m=+152.829976405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.811315 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29320890-q8f6x_c7783dc1-a72a-4c0d-a78a-eac1d725b1d8/collect-profiles/0.log"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.811367 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.821951 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.822253 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.823355 4797 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sx5xp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.823390 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" podUID="ebfcdd10-b87b-4d46-9bea-3d1b98273a28" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused"
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.908633 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpphp\" (UniqueName: \"kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp\") pod \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") "
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.908728 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume\") pod \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") "
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.908833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.908858 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume\") pod \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\" (UID: \"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8\") "
Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.908992 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.408972017 +0000 UTC m=+152.931471255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.909126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:01 crc kubenswrapper[4797]: E0930 17:45:01.909428 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.409417908 +0000 UTC m=+152.931917146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.909534 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" (UID: "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.917865 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp" (OuterVolumeSpecName: "kube-api-access-lpphp") pod "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" (UID: "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8"). InnerVolumeSpecName "kube-api-access-lpphp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:45:01 crc kubenswrapper[4797]: I0930 17:45:01.917900 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" (UID: "c7783dc1-a72a-4c0d-a78a-eac1d725b1d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.009933 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.010096 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.510069437 +0000 UTC m=+153.032568675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.010476 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.010577 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpphp\" (UniqueName: \"kubernetes.io/projected/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-kube-api-access-lpphp\") on node \"crc\" DevicePath \"\""
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.010590 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.010598 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.010840 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.510828826 +0000 UTC m=+153.033328074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.112095 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.112282 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.612256084 +0000 UTC m=+153.134755322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.112379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.112686 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.612678184 +0000 UTC m=+153.135177412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.213773 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.213892 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.713871467 +0000 UTC m=+153.236370715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.214074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.214358 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.714348009 +0000 UTC m=+153.236847267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.233990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" event={"ID":"542a58fe-9654-424c-90dc-6d073f486328","Type":"ContainerStarted","Data":"68560197d586281eddf9b016d42438c6820920ce9461182ad4bb1db69e97865f"}
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.235524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvlqn" event={"ID":"8b5e8fb3-229c-474b-86e8-d52c08555c5e","Type":"ContainerStarted","Data":"199dc28f1913c981bf4d6a278d61667e37f996e94d01a5a676231e9e240d056a"}
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.235659 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hvlqn"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236697 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29320890-q8f6x_c7783dc1-a72a-4c0d-a78a-eac1d725b1d8/collect-profiles/0.log"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236732 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" containerID="222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce" exitCode=2
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236770 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236809 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" event={"ID":"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8","Type":"ContainerDied","Data":"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"}
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236836 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x" event={"ID":"c7783dc1-a72a-4c0d-a78a-eac1d725b1d8","Type":"ContainerDied","Data":"79b897c8f650a1a29348d2514da9985a3ad38bd115abd9b3edb66e9135262a79"}
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.236856 4797 scope.go:117] "RemoveContainer" containerID="222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237706 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237750 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237802 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dhrrh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237811 4797 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8pvns container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body=
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237811 4797 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-764b6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body=
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237858 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" podUID="b789d400-5446-4d1e-9b8a-c563b150668d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.237880 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" podUID="e6ccf165-5613-42ca-bcc3-6c44c9db369a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.239404 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.239738 4797 patch_prober.go:28] interesting pod/console-operator-58897d9998-gh7d7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.239793 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" podUID="2859bf41-057a-48e9-b94f-3be799ab24ee" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.252823 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xqsxf"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.253975 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" podStartSLOduration=2.2539460399999998 podStartE2EDuration="2.25394604s" podCreationTimestamp="2025-09-30 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.253347855 +0000 UTC m=+152.775847113" watchObservedRunningTime="2025-09-30 17:45:02.25394604 +0000 UTC m=+152.776445278"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.278072 4797 scope.go:117] "RemoveContainer" containerID="222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.278536 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce\": container with ID starting with 222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce not found: ID does not exist" containerID="222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.278572 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce"} err="failed to get container status \"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce\": rpc error: code = NotFound desc = could not find container \"222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce\": container with ID starting with 222cceba312cab95e78c05766959736aeda3ba55d084a5ee7af65602744202ce not found: ID does not exist"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.279658 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bkdjn" podStartSLOduration=131.279648525 podStartE2EDuration="2m11.279648525s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.277724866 +0000 UTC m=+152.800224114" watchObservedRunningTime="2025-09-30 17:45:02.279648525 +0000 UTC m=+152.802147753"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.314140 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" podStartSLOduration=132.314121395 podStartE2EDuration="2m12.314121395s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.312787681 +0000 UTC m=+152.835286919" watchObservedRunningTime="2025-09-30 17:45:02.314121395 +0000 UTC m=+152.836620633"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.316048 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.316395 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.816380563 +0000 UTC m=+153.338879801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.348869 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hvlqn" podStartSLOduration=8.348848621 podStartE2EDuration="8.348848621s" podCreationTimestamp="2025-09-30 17:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.330974355 +0000 UTC m=+152.853473603" watchObservedRunningTime="2025-09-30 17:45:02.348848621 +0000 UTC m=+152.871347859"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.360012 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rk7f6" podStartSLOduration=131.359995176 podStartE2EDuration="2m11.359995176s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.347740523 +0000 UTC m=+152.870239761" watchObservedRunningTime="2025-09-30 17:45:02.359995176 +0000 UTC m=+152.882494414"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.427661 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.435109 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:02.935088401 +0000 UTC m=+153.457587629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.463932 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" podStartSLOduration=131.463910967 podStartE2EDuration="2m11.463910967s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.46126422 +0000 UTC m=+152.983763458" watchObservedRunningTime="2025-09-30 17:45:02.463910967 +0000 UTC m=+152.986410205"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.531384 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ntb2" podStartSLOduration=131.531367349 podStartE2EDuration="2m11.531367349s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.529016188 +0000 UTC m=+153.051515436" watchObservedRunningTime="2025-09-30 17:45:02.531367349 +0000 UTC m=+153.053866587"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.536020 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.536062 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.036039948 +0000 UTC m=+153.558539186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.536250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.536547 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.03653145 +0000 UTC m=+153.559030688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.569334 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9dt8d" podStartSLOduration=131.569315646 podStartE2EDuration="2m11.569315646s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.567290055 +0000 UTC m=+153.089789293" watchObservedRunningTime="2025-09-30 17:45:02.569315646 +0000 UTC m=+153.091814884"
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.637262 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.637408 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.137388294 +0000 UTC m=+153.659887532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.637745 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.638040 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.13802952 +0000 UTC m=+153.660528758 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.679800 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2fwf" podStartSLOduration=131.679782906 podStartE2EDuration="2m11.679782906s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:02.67802375 +0000 UTC m=+153.200522988" watchObservedRunningTime="2025-09-30 17:45:02.679782906 +0000 UTC m=+153.202282144" Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.709385 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:02 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:02 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:02 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.709463 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.717911 4797 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"] Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.720280 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-q8f6x"] Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.738565 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.738732 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.238707279 +0000 UTC m=+153.761206517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.738777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.739113 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.239098709 +0000 UTC m=+153.761597947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.840032 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.840209 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.340184958 +0000 UTC m=+153.862684196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.840267 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.840626 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.340614019 +0000 UTC m=+153.863113257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.940789 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.941031 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.440994241 +0000 UTC m=+153.963493519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:02 crc kubenswrapper[4797]: I0930 17:45:02.941104 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:02 crc kubenswrapper[4797]: E0930 17:45:02.941378 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.44136556 +0000 UTC m=+153.963864798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.042224 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.042540 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.54249773 +0000 UTC m=+154.064997008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.042611 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.042943 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.542928881 +0000 UTC m=+154.065428189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.141904 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143333 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.143540 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.643524058 +0000 UTC m=+154.166023296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143640 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143724 4797 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7n6zc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143727 4797 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7n6zc container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143776 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" podUID="0be683a6-13c1-4895-a59e-b5a7337633b4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": 
dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.143812 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" podUID="0be683a6-13c1-4895-a59e-b5a7337633b4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.143885 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.643876668 +0000 UTC m=+154.166375906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.243276 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" event={"ID":"b65fb47f-d3bc-40dc-8352-27edf856762c","Type":"ContainerStarted","Data":"445de6f8eead7ab6e719eaf1f4e03d166714db3e7016ff32edbaea9bec745886"} Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.244191 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.244300 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.744281249 +0000 UTC m=+154.266780487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.244348 4797 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7n6zc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.244385 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.244381 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" podUID="0be683a6-13c1-4895-a59e-b5a7337633b4" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.244702 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.744686699 +0000 UTC m=+154.267185937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.345493 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.345622 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.845603856 +0000 UTC m=+154.368103094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.346029 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.347608 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.847600016 +0000 UTC m=+154.370099254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.447666 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.447878 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.947847004 +0000 UTC m=+154.470346242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.447946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.448286 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:03.948271435 +0000 UTC m=+154.470770683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.548744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.548900 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.048874432 +0000 UTC m=+154.571373670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.549084 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.549409 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.049401085 +0000 UTC m=+154.571900323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.652693 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.652826 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.152809724 +0000 UTC m=+154.675308962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.652935 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.653290 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.153282486 +0000 UTC m=+154.675781724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.708320 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:03 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:03 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:03 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.708394 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.753734 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.754056 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:45:04.254042317 +0000 UTC m=+154.776541545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.764527 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.764718 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" containerName="collect-profiles" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.764729 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" containerName="collect-profiles" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.764810 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" containerName="collect-profiles" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.765124 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.767859 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.769386 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.815757 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.855391 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.855711 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.355698271 +0000 UTC m=+154.878197509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.956912 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.957044 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.457025457 +0000 UTC m=+154.979524695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.957093 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.957128 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:03 crc kubenswrapper[4797]: I0930 17:45:03.957156 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:03 crc kubenswrapper[4797]: E0930 17:45:03.957426 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:45:04.457418717 +0000 UTC m=+154.979917955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.058584 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.058717 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.558698571 +0000 UTC m=+155.081197809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.058937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.058975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.059005 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.059119 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.059241 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.559233154 +0000 UTC m=+155.081732392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.076573 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.078832 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.159769 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.160030 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.660001476 +0000 UTC m=+155.182500734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.160292 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.160588 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.66057551 +0000 UTC m=+155.183074748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.245480 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7783dc1-a72a-4c0d-a78a-eac1d725b1d8" path="/var/lib/kubelet/pods/c7783dc1-a72a-4c0d-a78a-eac1d725b1d8/volumes" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.261230 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.261651 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.76163599 +0000 UTC m=+155.284135228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.383364 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.384823 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.884808252 +0000 UTC m=+155.407307490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.464065 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 17:45:04 crc kubenswrapper[4797]: W0930 17:45:04.483358 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd50bec08_3d51_4bd4_903f_c564b219b400.slice/crio-89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db WatchSource:0}: Error finding container 89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db: Status 404 returned error can't find the container with id 89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.484043 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.484346 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:04.984317811 +0000 UTC m=+155.506817059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.585668 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.585992 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.085976475 +0000 UTC m=+155.608475713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.691260 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.693023 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.193001786 +0000 UTC m=+155.715501024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.706677 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:04 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:04 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:04 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.706730 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.806723 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.807006 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:45:05.306994635 +0000 UTC m=+155.829493873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.907136 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.907303 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.407277734 +0000 UTC m=+155.929776972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:04 crc kubenswrapper[4797]: I0930 17:45:04.907363 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:04 crc kubenswrapper[4797]: E0930 17:45:04.907638 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.407623652 +0000 UTC m=+155.930122890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.008326 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.008808 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.508787754 +0000 UTC m=+156.031286992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.110461 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.110768 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.610756446 +0000 UTC m=+156.133255684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.211685 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.211937 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.711911357 +0000 UTC m=+156.234410595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.212144 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.212420 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.71240913 +0000 UTC m=+156.234908368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.253071 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d50bec08-3d51-4bd4-903f-c564b219b400","Type":"ContainerStarted","Data":"d2fcaa243c57a92471401e304f93f1e2ae2430ff3c56fda91643613225c6537d"} Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.253113 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d50bec08-3d51-4bd4-903f-c564b219b400","Type":"ContainerStarted","Data":"89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db"} Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.268410 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.268393008 podStartE2EDuration="2.268393008s" podCreationTimestamp="2025-09-30 17:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:05.266726475 +0000 UTC m=+155.789225703" watchObservedRunningTime="2025-09-30 17:45:05.268393008 +0000 UTC m=+155.790892246" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.313680 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.314028 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.814003872 +0000 UTC m=+156.336503110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.415056 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.415399 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:05.915383349 +0000 UTC m=+156.437882587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.516090 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.516221 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.016198751 +0000 UTC m=+156.538697989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.516483 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.516876 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.016865599 +0000 UTC m=+156.539364837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.617128 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.617305 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.11727979 +0000 UTC m=+156.639779028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.617391 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.617731 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.117717542 +0000 UTC m=+156.640216780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.708278 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:05 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:05 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:05 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.708349 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.718148 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.718320 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 17:45:06.218299078 +0000 UTC m=+156.740798316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.718393 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.718664 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.218656598 +0000 UTC m=+156.741155836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.738246 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.739179 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.741621 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.770384 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.819449 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.819650 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.319622744 +0000 UTC m=+156.842121982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.819708 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.820010 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.320002044 +0000 UTC m=+156.842501282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.921326 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.921527 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.421502153 +0000 UTC m=+156.944001391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.921658 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.921690 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9r2t\" (UniqueName: \"kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.921747 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.921790 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:05 crc kubenswrapper[4797]: E0930 17:45:05.922117 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.422100188 +0000 UTC m=+156.944599426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.928572 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.929472 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.934628 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 17:45:05 crc kubenswrapper[4797]: I0930 17:45:05.949399 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022232 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022364 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.022406 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.522376837 +0000 UTC m=+157.044876085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022484 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9r2t\" (UniqueName: \"kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022566 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022630 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities\") pod 
\"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022704 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzttz\" (UniqueName: \"kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022748 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.022759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.022894 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.52286437 +0000 UTC m=+157.045363608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.026862 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.039593 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9r2t\" (UniqueName: \"kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t\") pod \"community-operators-d48nz\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.050772 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.124591 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.125805 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.126870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.126972 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.626953306 +0000 UTC m=+157.149452544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127255 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmnnd\" (UniqueName: \"kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127353 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127392 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127439 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127476 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127506 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzttz\" (UniqueName: \"kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.127535 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.128215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.128532 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.628520166 +0000 UTC m=+157.151019404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.128861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.145560 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7n6zc" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.159708 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.179968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzttz\" (UniqueName: \"kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz\") pod \"certified-operators-p67c6\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.228834 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.229024 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.229087 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.229163 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.729142764 +0000 UTC m=+157.251642002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.229255 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmnnd\" (UniqueName: \"kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.229490 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.230333 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.246678 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.286639 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmnnd\" (UniqueName: \"kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd\") pod \"community-operators-vlfbv\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.295763 4797 generic.go:334] "Generic (PLEG): container finished" podID="d50bec08-3d51-4bd4-903f-c564b219b400" containerID="d2fcaa243c57a92471401e304f93f1e2ae2430ff3c56fda91643613225c6537d" exitCode=0 Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.295830 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d50bec08-3d51-4bd4-903f-c564b219b400","Type":"ContainerDied","Data":"d2fcaa243c57a92471401e304f93f1e2ae2430ff3c56fda91643613225c6537d"} Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.324992 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.327497 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.330584 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.331061 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.831045643 +0000 UTC m=+157.353544881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.338842 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.433269 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.433483 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.433542 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rlw\" (UniqueName: \"kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.433607 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:06.9335817 +0000 UTC m=+157.456080938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.433666 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.442048 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.534602 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.534647 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.534691 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8rlw\" (UniqueName: \"kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.534713 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.535080 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.535295 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.035282215 +0000 UTC m=+157.557781443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.535616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.560200 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rlw\" (UniqueName: \"kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw\") pod \"certified-operators-dqgjz\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.605657 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.627470 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.627501 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.636642 4797 patch_prober.go:28] interesting pod/console-f9d7485db-ngfnz container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.636721 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ngfnz" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.636957 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.637257 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.137233437 +0000 UTC m=+157.659732735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.654499 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.658459 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.706041 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.708387 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:06 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:06 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:06 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.708455 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.738904 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.740060 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:45:07.24004505 +0000 UTC m=+157.762544298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.833995 4797 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sx5xp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]log ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]etcd ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/max-in-flight-filter ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 17:45:06 crc kubenswrapper[4797]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 17:45:06 crc kubenswrapper[4797]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/openshift.io-startinformers ok Sep 30 
17:45:06 crc kubenswrapper[4797]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 17:45:06 crc kubenswrapper[4797]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 17:45:06 crc kubenswrapper[4797]: livez check failed Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.834340 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" podUID="ebfcdd10-b87b-4d46-9bea-3d1b98273a28" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.839871 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.840080 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.340044892 +0000 UTC m=+157.862544130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.840130 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.840401 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.34038907 +0000 UTC m=+157.862888308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.941094 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.941294 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.441262794 +0000 UTC m=+157.963762042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:06 crc kubenswrapper[4797]: I0930 17:45:06.941356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:06 crc kubenswrapper[4797]: E0930 17:45:06.941762 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.441739236 +0000 UTC m=+157.964238474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.042712 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.043003 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.54298335 +0000 UTC m=+158.065482598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.043314 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.043840 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.543831552 +0000 UTC m=+158.066330790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.103164 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:45:07 crc kubenswrapper[4797]: W0930 17:45:07.104615 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b1d14d_8ba7_4862_ae38_d9dc28061c12.slice/crio-435f0ab0b575b0dac2ea3d3e52e2b9c148cc4c3b992112ad17ada3ca015c67f7 WatchSource:0}: Error finding container 435f0ab0b575b0dac2ea3d3e52e2b9c148cc4c3b992112ad17ada3ca015c67f7: Status 404 returned error can't find the container with id 435f0ab0b575b0dac2ea3d3e52e2b9c148cc4c3b992112ad17ada3ca015c67f7 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.144290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.144803 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.644784398 +0000 UTC m=+158.167283636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.236958 4797 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.246702 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.247060 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.747044067 +0000 UTC m=+158.269543305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-shscm" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.262571 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.265533 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.265592 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.265634 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.265685 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.279631 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gh7d7" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.303657 4797 generic.go:334] "Generic (PLEG): container finished" podID="623778df-ea9e-4898-8286-f200b9a29844" containerID="f811ea69af862d2dc0bba1895c62d8b4bead7250a14810833c23f0bdbe020012" exitCode=0 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.303766 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerDied","Data":"f811ea69af862d2dc0bba1895c62d8b4bead7250a14810833c23f0bdbe020012"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.303819 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerStarted","Data":"5bc89ab0a7e78359ffcc31cd8f0dd75875f3bde0265b9c67c8cb2668e17d3308"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.306052 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.308108 4797 generic.go:334] "Generic (PLEG): container finished" podID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerID="7415f2de609de0360f364043822f1d55e9b693faf606ce1d3403ac9810949b56" exitCode=0 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.308216 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerDied","Data":"7415f2de609de0360f364043822f1d55e9b693faf606ce1d3403ac9810949b56"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.308244 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerStarted","Data":"23f56a354ca325ec1e5e3983a5a4c242b7d82a57b12d74a2b5a4742f3d6beebb"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.320687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerStarted","Data":"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.320965 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerStarted","Data":"435f0ab0b575b0dac2ea3d3e52e2b9c148cc4c3b992112ad17ada3ca015c67f7"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.323837 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" event={"ID":"b65fb47f-d3bc-40dc-8352-27edf856762c","Type":"ContainerStarted","Data":"8298aecf1890bb5f0f59eda98ff9f1cb7f4f3c0f3b0236dd5b024a3d382b330d"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.323886 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" event={"ID":"b65fb47f-d3bc-40dc-8352-27edf856762c","Type":"ContainerStarted","Data":"21f68f5e0cc991b7d9a5570710d022814b2ac40668fb39ac949b3c6c5cbf0409"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.328383 4797 generic.go:334] "Generic (PLEG): container finished" podID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerID="19045b39b519949eface2feb541e4f95180c57196ff72a6f898f64eaab5f77ad" exitCode=0 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.328497 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" 
event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerDied","Data":"19045b39b519949eface2feb541e4f95180c57196ff72a6f898f64eaab5f77ad"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.328535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerStarted","Data":"ae89bf1e13341def52c8d7ab731532b6cdc9036ba2b70ce675db9c41e4557326"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.336339 4797 generic.go:334] "Generic (PLEG): container finished" podID="542a58fe-9654-424c-90dc-6d073f486328" containerID="68560197d586281eddf9b016d42438c6820920ce9461182ad4bb1db69e97865f" exitCode=0 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.336565 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" event={"ID":"542a58fe-9654-424c-90dc-6d073f486328","Type":"ContainerDied","Data":"68560197d586281eddf9b016d42438c6820920ce9461182ad4bb1db69e97865f"} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.347560 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.347923 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:45:07.84791037 +0000 UTC m=+158.370409598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.430749 4797 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T17:45:07.236997031Z","Handler":null,"Name":""} Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.436167 4797 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.436206 4797 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.449335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.452304 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.452337 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.494231 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-shscm\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.545606 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.552798 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.567376 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.588569 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.650014 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.654605 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir\") pod \"d50bec08-3d51-4bd4-903f-c564b219b400\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.654733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access\") pod \"d50bec08-3d51-4bd4-903f-c564b219b400\" (UID: \"d50bec08-3d51-4bd4-903f-c564b219b400\") " Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.655220 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d50bec08-3d51-4bd4-903f-c564b219b400" (UID: "d50bec08-3d51-4bd4-903f-c564b219b400"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.660981 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d50bec08-3d51-4bd4-903f-c564b219b400" (UID: "d50bec08-3d51-4bd4-903f-c564b219b400"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.705764 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.707160 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4crt" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.708994 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:07 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:07 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:07 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.709058 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.710390 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.727243 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-764b6" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.738288 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8pvns" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.755871 4797 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50bec08-3d51-4bd4-903f-c564b219b400-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.755913 4797 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50bec08-3d51-4bd4-903f-c564b219b400-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.846916 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"] Sep 30 17:45:07 crc kubenswrapper[4797]: W0930 17:45:07.857713 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b643afa_b83b_4ad0_9d8e_d950d5fba3e4.slice/crio-5025af6818c33fe652568cfdba999634ad795eeda5ba03bab5c20ed22cf3b010 WatchSource:0}: Error finding container 5025af6818c33fe652568cfdba999634ad795eeda5ba03bab5c20ed22cf3b010: Status 404 returned error can't find the container with id 5025af6818c33fe652568cfdba999634ad795eeda5ba03bab5c20ed22cf3b010 Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.918579 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:45:07 crc kubenswrapper[4797]: E0930 17:45:07.918880 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50bec08-3d51-4bd4-903f-c564b219b400" containerName="pruner" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.918895 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50bec08-3d51-4bd4-903f-c564b219b400" containerName="pruner" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.919018 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50bec08-3d51-4bd4-903f-c564b219b400" containerName="pruner" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.919790 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.925744 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.929370 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.988110 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.990718 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.995099 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.996795 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 17:45:07 crc kubenswrapper[4797]: I0930 17:45:07.998927 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.059480 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.059561 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbmh\" (UniqueName: 
\"kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.059661 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.160791 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.160909 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbmh\" (UniqueName: \"kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.160947 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.161016 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.161088 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.161597 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.161787 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.186544 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbmh\" (UniqueName: \"kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh\") pod \"redhat-marketplace-wkwqb\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.240769 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.254796 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.262698 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.263071 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.263203 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.286278 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.314197 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.319789 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.320716 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.333012 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.351544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d50bec08-3d51-4bd4-903f-c564b219b400","Type":"ContainerDied","Data":"89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db"} Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.351593 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89a49a28d4ad72aabc4ec6380bb22918370363345e162321718371e0a6ecc4db" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.351692 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.370718 4797 generic.go:334] "Generic (PLEG): container finished" podID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerID="f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961" exitCode=0 Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.371025 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerDied","Data":"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961"} Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.417287 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" event={"ID":"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4","Type":"ContainerStarted","Data":"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"} Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.417337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" event={"ID":"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4","Type":"ContainerStarted","Data":"5025af6818c33fe652568cfdba999634ad795eeda5ba03bab5c20ed22cf3b010"} Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.417646 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.443089 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" event={"ID":"b65fb47f-d3bc-40dc-8352-27edf856762c","Type":"ContainerStarted","Data":"efd033d84fa193a554d5f08e95b217bde7ed84046d6f07173a3a652ee29fa9b3"} Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.467452 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.467836 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.467884 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh6nk\" (UniqueName: \"kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.463409 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" podStartSLOduration=137.463393914 podStartE2EDuration="2m17.463393914s" podCreationTimestamp="2025-09-30 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:08.460852589 +0000 UTC m=+158.983351827" watchObservedRunningTime="2025-09-30 17:45:08.463393914 +0000 UTC m=+158.985893152" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.485191 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xc7x2" podStartSLOduration=14.48517636 podStartE2EDuration="14.48517636s" podCreationTimestamp="2025-09-30 17:44:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:08.484960694 +0000 UTC m=+159.007459932" watchObservedRunningTime="2025-09-30 17:45:08.48517636 +0000 UTC m=+159.007675598" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.569335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.569557 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.569619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh6nk\" (UniqueName: \"kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.570989 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.571299 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.581377 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.587296 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh6nk\" (UniqueName: \"kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk\") pod \"redhat-marketplace-88s5q\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.649150 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.703614 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.706444 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:08 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:08 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:08 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.706554 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.845427 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.872591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume\") pod \"542a58fe-9654-424c-90dc-6d073f486328\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.872659 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9q7r\" (UniqueName: \"kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r\") pod \"542a58fe-9654-424c-90dc-6d073f486328\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.872721 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume\") pod \"542a58fe-9654-424c-90dc-6d073f486328\" (UID: \"542a58fe-9654-424c-90dc-6d073f486328\") " Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.873946 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume" (OuterVolumeSpecName: "config-volume") pod "542a58fe-9654-424c-90dc-6d073f486328" (UID: "542a58fe-9654-424c-90dc-6d073f486328"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.881773 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r" (OuterVolumeSpecName: "kube-api-access-s9q7r") pod "542a58fe-9654-424c-90dc-6d073f486328" (UID: "542a58fe-9654-424c-90dc-6d073f486328"). InnerVolumeSpecName "kube-api-access-s9q7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.882084 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "542a58fe-9654-424c-90dc-6d073f486328" (UID: "542a58fe-9654-424c-90dc-6d073f486328"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.915890 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:45:08 crc kubenswrapper[4797]: E0930 17:45:08.916288 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542a58fe-9654-424c-90dc-6d073f486328" containerName="collect-profiles" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.916298 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="542a58fe-9654-424c-90dc-6d073f486328" containerName="collect-profiles" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.916384 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="542a58fe-9654-424c-90dc-6d073f486328" containerName="collect-profiles" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.917111 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.919817 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.932334 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.939378 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.973983 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/542a58fe-9654-424c-90dc-6d073f486328-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.974016 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9q7r\" (UniqueName: 
\"kubernetes.io/projected/542a58fe-9654-424c-90dc-6d073f486328-kube-api-access-s9q7r\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:08 crc kubenswrapper[4797]: I0930 17:45:08.974027 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/542a58fe-9654-424c-90dc-6d073f486328-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.076026 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgrhj\" (UniqueName: \"kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.076297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.076365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.177105 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 
30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.177178 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgrhj\" (UniqueName: \"kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.177209 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.177713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.178501 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.201336 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgrhj\" (UniqueName: \"kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj\") pod \"redhat-operators-7jv9j\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 
17:45:09.241090 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.320028 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.321054 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.330319 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.364763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.459635 4797 generic.go:334] "Generic (PLEG): container finished" podID="1c653129-5d28-4cee-8df0-27f782045d84" containerID="6630b02104ef2bcfccaf536da8e9ac2d079e35a251babf3317fba7f2c3f2d3b2" exitCode=0 Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.459930 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerDied","Data":"6630b02104ef2bcfccaf536da8e9ac2d079e35a251babf3317fba7f2c3f2d3b2"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.459996 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerStarted","Data":"2df110bfd7313776475101cbfa3beb27dc70025ac703c20141387840a11d9859"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.464624 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c","Type":"ContainerStarted","Data":"017373fda3373758f296150b005b3f8bd03a7488cc39288dd22bf1f99d803c33"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.464662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c","Type":"ContainerStarted","Data":"969d637f78b1c6295cb2a51e76c27b81612336f1798845cb22468830cf8d5d82"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.466788 4797 generic.go:334] "Generic (PLEG): container finished" podID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerID="12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e" exitCode=0 Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.466825 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerDied","Data":"12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.466840 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerStarted","Data":"cd660790485fa73c9f2a420c80008e929645c75944c4ee91362a6367016cecbf"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.469694 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.470048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr" event={"ID":"542a58fe-9654-424c-90dc-6d073f486328","Type":"ContainerDied","Data":"abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051"} Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.470074 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abec4a89d14b78d3fe11a1e5fdaae16b55845de81a18b444f4e0bf503ab04051" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.480696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2d2\" (UniqueName: \"kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.480790 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.480829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.515767 4797 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.515746437 podStartE2EDuration="2.515746437s" podCreationTimestamp="2025-09-30 17:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:45:09.513336625 +0000 UTC m=+160.035835863" watchObservedRunningTime="2025-09-30 17:45:09.515746437 +0000 UTC m=+160.038245675" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.582414 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.582521 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.582544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2d2\" (UniqueName: \"kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.583094 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " 
pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.583662 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.611926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2d2\" (UniqueName: \"kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2\") pod \"redhat-operators-lw6hl\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.641188 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.708022 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:09 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:09 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:09 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.708091 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.765991 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.779027 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hvlqn" Sep 30 17:45:09 crc kubenswrapper[4797]: I0930 17:45:09.877912 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:45:09 crc kubenswrapper[4797]: W0930 17:45:09.914573 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a63ddc_c622_4c51_80d5_933b49fd1bc7.slice/crio-2d4142d7b9158c2850b21ccc81d587e5d0296fafefaf26fc75e7209dd7769e2d WatchSource:0}: Error finding container 2d4142d7b9158c2850b21ccc81d587e5d0296fafefaf26fc75e7209dd7769e2d: Status 404 returned error can't find the container with id 2d4142d7b9158c2850b21ccc81d587e5d0296fafefaf26fc75e7209dd7769e2d Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.476747 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerStarted","Data":"2d4142d7b9158c2850b21ccc81d587e5d0296fafefaf26fc75e7209dd7769e2d"} Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.479614 4797 generic.go:334] "Generic (PLEG): container finished" podID="366f3f10-83ff-410c-add7-935d4a0811ba" containerID="c71315081bb68442075f81844b6ca20417b0eef0eccfdf444de086313e3fc19c" exitCode=0 Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.479793 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerDied","Data":"c71315081bb68442075f81844b6ca20417b0eef0eccfdf444de086313e3fc19c"} Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.479854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerStarted","Data":"c4b8a1a46465f160ddf27ad9ae0c642e8b6672bf42833c9be9210f16b0550a30"} Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.488242 4797 generic.go:334] "Generic (PLEG): container finished" podID="1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" containerID="017373fda3373758f296150b005b3f8bd03a7488cc39288dd22bf1f99d803c33" exitCode=0 Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.488305 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c","Type":"ContainerDied","Data":"017373fda3373758f296150b005b3f8bd03a7488cc39288dd22bf1f99d803c33"} Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.707489 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:10 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:10 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:10 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:10 crc kubenswrapper[4797]: I0930 17:45:10.707557 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.504035 4797 generic.go:334] "Generic (PLEG): container finished" podID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerID="92ab7545c03be8ed375fa88b66299f5970f9553620b729684418ffffca7b90ef" exitCode=0 Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.504130 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerDied","Data":"92ab7545c03be8ed375fa88b66299f5970f9553620b729684418ffffca7b90ef"} Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.709136 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:11 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:11 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:11 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.709379 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.830414 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:45:11 crc kubenswrapper[4797]: I0930 17:45:11.842404 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-sx5xp" Sep 30 17:45:12 crc kubenswrapper[4797]: I0930 17:45:12.706480 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:12 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:12 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:12 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:12 crc kubenswrapper[4797]: I0930 17:45:12.706532 4797 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:13 crc kubenswrapper[4797]: I0930 17:45:13.708159 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:13 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:13 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:13 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:13 crc kubenswrapper[4797]: I0930 17:45:13.708242 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.192081 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.192138 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.666566 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.689270 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2fc9be6-9aff-4e05-aadc-5a81cbfea32e-metrics-certs\") pod \"network-metrics-daemon-rx9f5\" (UID: \"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e\") " pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.707142 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:14 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:14 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:14 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.707199 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:14 crc kubenswrapper[4797]: I0930 17:45:14.980198 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rx9f5" Sep 30 17:45:15 crc kubenswrapper[4797]: I0930 17:45:15.706328 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:15 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:15 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:15 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:15 crc kubenswrapper[4797]: I0930 17:45:15.706393 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:16 crc kubenswrapper[4797]: I0930 17:45:16.628350 4797 patch_prober.go:28] interesting pod/console-f9d7485db-ngfnz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Sep 30 17:45:16 crc kubenswrapper[4797]: I0930 17:45:16.628726 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ngfnz" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Sep 30 17:45:16 crc kubenswrapper[4797]: I0930 17:45:16.706757 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:16 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:16 
crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:16 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:16 crc kubenswrapper[4797]: I0930 17:45:16.706830 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.265981 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.265960 4797 patch_prober.go:28] interesting pod/downloads-7954f5f757-dhl6s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.266052 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.266163 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dhl6s" podUID="4255c7fa-ba59-4363-ad05-cb5e7a8628e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.708471 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:17 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:17 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:17 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:17 crc kubenswrapper[4797]: I0930 17:45:17.708547 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:18 crc kubenswrapper[4797]: I0930 17:45:18.706685 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:18 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:18 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:18 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:18 crc kubenswrapper[4797]: I0930 17:45:18.706778 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:18 crc kubenswrapper[4797]: I0930 17:45:18.939525 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.140510 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access\") pod \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.140591 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir\") pod \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\" (UID: \"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c\") " Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.140648 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" (UID: "1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.140823 4797 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.146356 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" (UID: "1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.242677 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.555734 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c","Type":"ContainerDied","Data":"969d637f78b1c6295cb2a51e76c27b81612336f1798845cb22468830cf8d5d82"} Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.555775 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969d637f78b1c6295cb2a51e76c27b81612336f1798845cb22468830cf8d5d82" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.555827 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.707764 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:19 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:19 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:19 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:19 crc kubenswrapper[4797]: I0930 17:45:19.707886 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:20 crc kubenswrapper[4797]: I0930 17:45:20.706957 4797 patch_prober.go:28] interesting 
pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:20 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:20 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:20 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:20 crc kubenswrapper[4797]: I0930 17:45:20.707046 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:21 crc kubenswrapper[4797]: I0930 17:45:21.705685 4797 patch_prober.go:28] interesting pod/router-default-5444994796-jzbcv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:45:21 crc kubenswrapper[4797]: [-]has-synced failed: reason withheld Sep 30 17:45:21 crc kubenswrapper[4797]: [+]process-running ok Sep 30 17:45:21 crc kubenswrapper[4797]: healthz check failed Sep 30 17:45:21 crc kubenswrapper[4797]: I0930 17:45:21.705760 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jzbcv" podUID="cf3e5bac-0992-4ba7-9899-44d44f898977" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:45:22 crc kubenswrapper[4797]: I0930 17:45:22.779287 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:45:22 crc kubenswrapper[4797]: I0930 17:45:22.782348 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jzbcv" Sep 30 17:45:26 crc kubenswrapper[4797]: I0930 
17:45:26.631612 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:45:26 crc kubenswrapper[4797]: I0930 17:45:26.635848 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:45:27 crc kubenswrapper[4797]: I0930 17:45:27.269622 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dhl6s" Sep 30 17:45:27 crc kubenswrapper[4797]: I0930 17:45:27.595350 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" Sep 30 17:45:37 crc kubenswrapper[4797]: I0930 17:45:37.716972 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5rg4c" Sep 30 17:45:39 crc kubenswrapper[4797]: I0930 17:45:39.583919 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:45:40 crc kubenswrapper[4797]: E0930 17:45:40.486323 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:45:40 crc kubenswrapper[4797]: E0930 17:45:40.486530 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmnnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vlfbv_openshift-marketplace(abfc63a5-f217-4b84-925a-91c4e900aaa8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:40 crc kubenswrapper[4797]: E0930 17:45:40.487733 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vlfbv" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" Sep 30 17:45:44 crc 
kubenswrapper[4797]: I0930 17:45:44.192192 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:45:44 crc kubenswrapper[4797]: I0930 17:45:44.192597 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:45:45 crc kubenswrapper[4797]: E0930 17:45:45.554376 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vlfbv" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" Sep 30 17:45:46 crc kubenswrapper[4797]: I0930 17:45:46.005422 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rx9f5"] Sep 30 17:45:47 crc kubenswrapper[4797]: E0930 17:45:47.568485 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:45:47 crc kubenswrapper[4797]: E0930 17:45:47.569101 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z9r2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d48nz_openshift-marketplace(96db7793-b3a4-46c5-889e-553d7d41ed0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:47 crc kubenswrapper[4797]: E0930 17:45:47.570372 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d48nz" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" Sep 30 17:45:48 crc 
kubenswrapper[4797]: E0930 17:45:48.625713 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d48nz" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" Sep 30 17:45:49 crc kubenswrapper[4797]: E0930 17:45:49.812011 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:45:49 crc kubenswrapper[4797]: E0930 17:45:49.812147 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfbmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wkwqb_openshift-marketplace(1c653129-5d28-4cee-8df0-27f782045d84): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:49 crc kubenswrapper[4797]: E0930 17:45:49.813406 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wkwqb" podUID="1c653129-5d28-4cee-8df0-27f782045d84" Sep 30 17:45:51 crc 
kubenswrapper[4797]: E0930 17:45:51.627289 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 17:45:51 crc kubenswrapper[4797]: E0930 17:45:51.627760 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bzttz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-p67c6_openshift-marketplace(623778df-ea9e-4898-8286-f200b9a29844): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:51 crc kubenswrapper[4797]: E0930 17:45:51.628943 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p67c6" podUID="623778df-ea9e-4898-8286-f200b9a29844" Sep 30 17:45:53 crc kubenswrapper[4797]: E0930 17:45:53.752317 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 17:45:53 crc kubenswrapper[4797]: E0930 17:45:53.752580 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8rlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dqgjz_openshift-marketplace(24b1d14d-8ba7-4862-ae38-d9dc28061c12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:53 crc kubenswrapper[4797]: E0930 17:45:53.753898 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dqgjz" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" Sep 30 17:45:56 crc 
kubenswrapper[4797]: E0930 17:45:56.650222 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p67c6" podUID="623778df-ea9e-4898-8286-f200b9a29844" Sep 30 17:45:56 crc kubenswrapper[4797]: E0930 17:45:56.650334 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wkwqb" podUID="1c653129-5d28-4cee-8df0-27f782045d84" Sep 30 17:45:56 crc kubenswrapper[4797]: E0930 17:45:56.650737 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dqgjz" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" Sep 30 17:45:56 crc kubenswrapper[4797]: W0930 17:45:56.656764 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fc9be6_9aff_4e05_aadc_5a81cbfea32e.slice/crio-85215d012c026313708d4d974ccc4860259094fff00fc18b9d796ee22566b39b WatchSource:0}: Error finding container 85215d012c026313708d4d974ccc4860259094fff00fc18b9d796ee22566b39b: Status 404 returned error can't find the container with id 85215d012c026313708d4d974ccc4860259094fff00fc18b9d796ee22566b39b Sep 30 17:45:56 crc kubenswrapper[4797]: I0930 17:45:56.825716 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" 
event={"ID":"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e","Type":"ContainerStarted","Data":"85215d012c026313708d4d974ccc4860259094fff00fc18b9d796ee22566b39b"} Sep 30 17:45:57 crc kubenswrapper[4797]: E0930 17:45:57.150922 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:45:57 crc kubenswrapper[4797]: E0930 17:45:57.151468 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xh6nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLog
sOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-88s5q_openshift-marketplace(2c10f39f-1403-4083-abff-6dc18d4812d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:45:57 crc kubenswrapper[4797]: E0930 17:45:57.152757 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-88s5q" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" Sep 30 17:45:57 crc kubenswrapper[4797]: I0930 17:45:57.841990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" event={"ID":"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e","Type":"ContainerStarted","Data":"e3ea1d29f3834f55dc54afda193efd77038c0c699b9a90ce933f4612cb7fc13d"} Sep 30 17:45:57 crc kubenswrapper[4797]: E0930 17:45:57.844026 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-88s5q" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" Sep 30 17:46:01 crc kubenswrapper[4797]: E0930 17:46:01.742210 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 17:46:01 crc kubenswrapper[4797]: E0930 17:46:01.742887 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgrhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7jv9j_openshift-marketplace(366f3f10-83ff-410c-add7-935d4a0811ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:46:01 crc kubenswrapper[4797]: E0930 17:46:01.744222 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7jv9j" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" Sep 30 17:46:01 crc kubenswrapper[4797]: E0930 17:46:01.868513 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7jv9j" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" Sep 30 17:46:07 crc kubenswrapper[4797]: E0930 17:46:07.704611 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 17:46:07 crc kubenswrapper[4797]: E0930 17:46:07.705237 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cl2d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lw6hl_openshift-marketplace(24a63ddc-c622-4c51-80d5-933b49fd1bc7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:46:07 crc kubenswrapper[4797]: E0930 17:46:07.707754 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lw6hl" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" Sep 30 17:46:07 crc 
kubenswrapper[4797]: I0930 17:46:07.903780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rx9f5" event={"ID":"d2fc9be6-9aff-4e05-aadc-5a81cbfea32e","Type":"ContainerStarted","Data":"c91f1a1d476b1c6ef1077d0df873ea84c6c7000a36f6094a958cb9329e09054e"} Sep 30 17:46:09 crc kubenswrapper[4797]: I0930 17:46:09.946044 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rx9f5" podStartSLOduration=199.946023627 podStartE2EDuration="3m19.946023627s" podCreationTimestamp="2025-09-30 17:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:46:09.945108492 +0000 UTC m=+220.467607790" watchObservedRunningTime="2025-09-30 17:46:09.946023627 +0000 UTC m=+220.468522875" Sep 30 17:46:14 crc kubenswrapper[4797]: I0930 17:46:14.192331 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:46:14 crc kubenswrapper[4797]: I0930 17:46:14.192430 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:46:14 crc kubenswrapper[4797]: I0930 17:46:14.192563 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:46:14 crc kubenswrapper[4797]: I0930 17:46:14.193251 4797 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:46:14 crc kubenswrapper[4797]: I0930 17:46:14.193349 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e" gracePeriod=600 Sep 30 17:46:17 crc kubenswrapper[4797]: I0930 17:46:17.981019 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e" exitCode=0 Sep 30 17:46:17 crc kubenswrapper[4797]: I0930 17:46:17.981196 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e"} Sep 30 17:46:28 crc kubenswrapper[4797]: I0930 17:46:28.043951 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5"} Sep 30 17:46:28 crc kubenswrapper[4797]: I0930 17:46:28.049347 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerStarted","Data":"c0e2ce5311775f735a3b85f2cdd0b969f7f0fcea7fb1c1d076174f83f921e31d"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 
17:46:29.057628 4797 generic.go:334] "Generic (PLEG): container finished" podID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerID="c0e2ce5311775f735a3b85f2cdd0b969f7f0fcea7fb1c1d076174f83f921e31d" exitCode=0 Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.057877 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerDied","Data":"c0e2ce5311775f735a3b85f2cdd0b969f7f0fcea7fb1c1d076174f83f921e31d"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.060880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerStarted","Data":"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.067379 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerStarted","Data":"29bb4446cecb0bfbb964a959d9eac60e56133be77012d4a83e1c830eaaf8b125"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.070033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerStarted","Data":"da9991aa57e907e01f0148a64158930c6e0d5950e3133a21d1e96071372a488e"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.079061 4797 generic.go:334] "Generic (PLEG): container finished" podID="1c653129-5d28-4cee-8df0-27f782045d84" containerID="d061be0790f17f465a7726b7f332589562b1311be1c72a42b3cdc4bc68f427a3" exitCode=0 Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.079148 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" 
event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerDied","Data":"d061be0790f17f465a7726b7f332589562b1311be1c72a42b3cdc4bc68f427a3"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.082638 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerStarted","Data":"6edfb3491bcf273be5766548a74e1f7b58b1f5c0cd1dba122a125d89788c195e"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.087004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerStarted","Data":"f8acbbb3ef1ff95cce236c66be04970186d00f0697d70c3890cf96dc2e7722be"} Sep 30 17:46:29 crc kubenswrapper[4797]: I0930 17:46:29.090932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerStarted","Data":"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003"} Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.100728 4797 generic.go:334] "Generic (PLEG): container finished" podID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerID="f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.100804 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerDied","Data":"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663"} Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.105185 4797 generic.go:334] "Generic (PLEG): container finished" podID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerID="29bb4446cecb0bfbb964a959d9eac60e56133be77012d4a83e1c830eaaf8b125" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 
17:46:30.105335 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerDied","Data":"29bb4446cecb0bfbb964a959d9eac60e56133be77012d4a83e1c830eaaf8b125"} Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.112342 4797 generic.go:334] "Generic (PLEG): container finished" podID="623778df-ea9e-4898-8286-f200b9a29844" containerID="da9991aa57e907e01f0148a64158930c6e0d5950e3133a21d1e96071372a488e" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.112503 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerDied","Data":"da9991aa57e907e01f0148a64158930c6e0d5950e3133a21d1e96071372a488e"} Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.118284 4797 generic.go:334] "Generic (PLEG): container finished" podID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerID="6edfb3491bcf273be5766548a74e1f7b58b1f5c0cd1dba122a125d89788c195e" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.118349 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerDied","Data":"6edfb3491bcf273be5766548a74e1f7b58b1f5c0cd1dba122a125d89788c195e"} Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.128355 4797 generic.go:334] "Generic (PLEG): container finished" podID="366f3f10-83ff-410c-add7-935d4a0811ba" containerID="f8acbbb3ef1ff95cce236c66be04970186d00f0697d70c3890cf96dc2e7722be" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.128512 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerDied","Data":"f8acbbb3ef1ff95cce236c66be04970186d00f0697d70c3890cf96dc2e7722be"} 
Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.132074 4797 generic.go:334] "Generic (PLEG): container finished" podID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerID="d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003" exitCode=0 Sep 30 17:46:30 crc kubenswrapper[4797]: I0930 17:46:30.133032 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerDied","Data":"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003"} Sep 30 17:46:31 crc kubenswrapper[4797]: I0930 17:46:31.140865 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerStarted","Data":"3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b"} Sep 30 17:46:31 crc kubenswrapper[4797]: I0930 17:46:31.165056 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vlfbv" podStartSLOduration=2.488349336 podStartE2EDuration="1m25.165035205s" podCreationTimestamp="2025-09-30 17:45:06 +0000 UTC" firstStartedPulling="2025-09-30 17:45:07.331156523 +0000 UTC m=+157.853655761" lastFinishedPulling="2025-09-30 17:46:30.007842382 +0000 UTC m=+240.530341630" observedRunningTime="2025-09-30 17:46:31.160551987 +0000 UTC m=+241.683051255" watchObservedRunningTime="2025-09-30 17:46:31.165035205 +0000 UTC m=+241.687534453" Sep 30 17:46:36 crc kubenswrapper[4797]: I0930 17:46:36.443367 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:36 crc kubenswrapper[4797]: I0930 17:46:36.444401 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:36 crc kubenswrapper[4797]: I0930 17:46:36.783124 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:37 crc kubenswrapper[4797]: I0930 17:46:37.248225 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:38 crc kubenswrapper[4797]: I0930 17:46:38.164805 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:46:38 crc kubenswrapper[4797]: I0930 17:46:38.187065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerStarted","Data":"490f7313d5677d7e14996502158e08c1f7576c26d69f5df9d020be3d8a6b9565"} Sep 30 17:46:39 crc kubenswrapper[4797]: I0930 17:46:39.193389 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vlfbv" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="registry-server" containerID="cri-o://3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" gracePeriod=2 Sep 30 17:46:39 crc kubenswrapper[4797]: I0930 17:46:39.225334 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7jv9j" podStartSLOduration=9.12967121 podStartE2EDuration="1m31.225315461s" podCreationTimestamp="2025-09-30 17:45:08 +0000 UTC" firstStartedPulling="2025-09-30 17:45:10.482045663 +0000 UTC m=+161.004544901" lastFinishedPulling="2025-09-30 17:46:32.577689894 +0000 UTC m=+243.100189152" observedRunningTime="2025-09-30 17:46:39.2229913 +0000 UTC m=+249.745490578" watchObservedRunningTime="2025-09-30 17:46:39.225315461 +0000 UTC m=+249.747814709" Sep 30 17:46:39 crc kubenswrapper[4797]: I0930 17:46:39.241688 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:46:39 crc 
kubenswrapper[4797]: I0930 17:46:39.241742 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:46:40 crc kubenswrapper[4797]: I0930 17:46:40.302819 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7jv9j" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="registry-server" probeResult="failure" output=< Sep 30 17:46:40 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 17:46:40 crc kubenswrapper[4797]: > Sep 30 17:46:43 crc kubenswrapper[4797]: I0930 17:46:43.220030 4797 generic.go:334] "Generic (PLEG): container finished" podID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerID="3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" exitCode=0 Sep 30 17:46:43 crc kubenswrapper[4797]: I0930 17:46:43.220123 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerDied","Data":"3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b"} Sep 30 17:46:46 crc kubenswrapper[4797]: E0930 17:46:46.444316 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b is running failed: container process not found" containerID="3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:46:46 crc kubenswrapper[4797]: E0930 17:46:46.446124 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b is running failed: container process not found" 
containerID="3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:46:46 crc kubenswrapper[4797]: E0930 17:46:46.446657 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b is running failed: container process not found" containerID="3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:46:46 crc kubenswrapper[4797]: E0930 17:46:46.446885 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-vlfbv" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="registry-server" Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.410853 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.607101 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content\") pod \"abfc63a5-f217-4b84-925a-91c4e900aaa8\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.607165 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmnnd\" (UniqueName: \"kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd\") pod \"abfc63a5-f217-4b84-925a-91c4e900aaa8\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.607244 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities\") pod \"abfc63a5-f217-4b84-925a-91c4e900aaa8\" (UID: \"abfc63a5-f217-4b84-925a-91c4e900aaa8\") " Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.608750 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities" (OuterVolumeSpecName: "utilities") pod "abfc63a5-f217-4b84-925a-91c4e900aaa8" (UID: "abfc63a5-f217-4b84-925a-91c4e900aaa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.622167 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd" (OuterVolumeSpecName: "kube-api-access-jmnnd") pod "abfc63a5-f217-4b84-925a-91c4e900aaa8" (UID: "abfc63a5-f217-4b84-925a-91c4e900aaa8"). InnerVolumeSpecName "kube-api-access-jmnnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.710940 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmnnd\" (UniqueName: \"kubernetes.io/projected/abfc63a5-f217-4b84-925a-91c4e900aaa8-kube-api-access-jmnnd\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:48 crc kubenswrapper[4797]: I0930 17:46:48.710993 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.263298 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlfbv" event={"ID":"abfc63a5-f217-4b84-925a-91c4e900aaa8","Type":"ContainerDied","Data":"ae89bf1e13341def52c8d7ab731532b6cdc9036ba2b70ce675db9c41e4557326"} Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.263363 4797 scope.go:117] "RemoveContainer" containerID="3d6c843b0fbd3d5432a61d574fcf649025814ecfa3b8837d91d1c2c350e2cc1b" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.263578 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlfbv" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.318764 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.327218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abfc63a5-f217-4b84-925a-91c4e900aaa8" (UID: "abfc63a5-f217-4b84-925a-91c4e900aaa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.382511 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.423420 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc63a5-f217-4b84-925a-91c4e900aaa8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.620321 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:46:49 crc kubenswrapper[4797]: I0930 17:46:49.627296 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vlfbv"] Sep 30 17:46:50 crc kubenswrapper[4797]: I0930 17:46:50.249367 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" path="/var/lib/kubelet/pods/abfc63a5-f217-4b84-925a-91c4e900aaa8/volumes" Sep 30 17:46:51 crc kubenswrapper[4797]: I0930 17:46:51.466151 4797 scope.go:117] "RemoveContainer" containerID="c0e2ce5311775f735a3b85f2cdd0b969f7f0fcea7fb1c1d076174f83f921e31d" Sep 30 17:46:55 crc kubenswrapper[4797]: I0930 17:46:55.132013 4797 scope.go:117] "RemoveContainer" containerID="19045b39b519949eface2feb541e4f95180c57196ff72a6f898f64eaab5f77ad" Sep 30 17:46:57 crc kubenswrapper[4797]: I0930 17:46:57.344636 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerStarted","Data":"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a"} Sep 30 17:46:57 crc kubenswrapper[4797]: I0930 17:46:57.348584 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" 
event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerStarted","Data":"60b3fae6ece164d290173dc761d1d2bf7838910ba7c04214abe07dc2f34c3e6e"} Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.355893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerStarted","Data":"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212"} Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.358288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerStarted","Data":"405f776972b277d7f5179817364ca14db80f1cad42ce4f2193c9d858a9830458"} Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.363504 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerStarted","Data":"4085a0433342367ad403904e2639cbf71e8da102f725fd08cab0a7f6d376afd3"} Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.365667 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerStarted","Data":"bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00"} Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.387074 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88s5q" podStartSLOduration=4.092429844 podStartE2EDuration="1m50.387057502s" podCreationTimestamp="2025-09-30 17:45:08 +0000 UTC" firstStartedPulling="2025-09-30 17:45:09.469171178 +0000 UTC m=+159.991670416" lastFinishedPulling="2025-09-30 17:46:55.763798796 +0000 UTC m=+266.286298074" observedRunningTime="2025-09-30 17:46:58.384473044 +0000 UTC m=+268.906972282" 
watchObservedRunningTime="2025-09-30 17:46:58.387057502 +0000 UTC m=+268.909556760" Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.403083 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p67c6" podStartSLOduration=5.5759666679999995 podStartE2EDuration="1m53.403061666s" podCreationTimestamp="2025-09-30 17:45:05 +0000 UTC" firstStartedPulling="2025-09-30 17:45:07.305721644 +0000 UTC m=+157.828220872" lastFinishedPulling="2025-09-30 17:46:55.132816592 +0000 UTC m=+265.655315870" observedRunningTime="2025-09-30 17:46:58.401033272 +0000 UTC m=+268.923532510" watchObservedRunningTime="2025-09-30 17:46:58.403061666 +0000 UTC m=+268.925560904" Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.649498 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.649556 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:46:58 crc kubenswrapper[4797]: I0930 17:46:58.704699 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.392862 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wkwqb" podStartSLOduration=26.511368672 podStartE2EDuration="1m52.392839585s" podCreationTimestamp="2025-09-30 17:45:07 +0000 UTC" firstStartedPulling="2025-09-30 17:45:09.463353529 +0000 UTC m=+159.985852767" lastFinishedPulling="2025-09-30 17:46:35.344824412 +0000 UTC m=+245.867323680" observedRunningTime="2025-09-30 17:46:59.3923031 +0000 UTC m=+269.914802348" watchObservedRunningTime="2025-09-30 17:46:59.392839585 +0000 UTC m=+269.915338833" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.410976 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqgjz" podStartSLOduration=7.2672231400000005 podStartE2EDuration="1m53.410954824s" podCreationTimestamp="2025-09-30 17:45:06 +0000 UTC" firstStartedPulling="2025-09-30 17:45:08.372337041 +0000 UTC m=+158.894836279" lastFinishedPulling="2025-09-30 17:46:54.516068715 +0000 UTC m=+265.038567963" observedRunningTime="2025-09-30 17:46:59.406638899 +0000 UTC m=+269.929138147" watchObservedRunningTime="2025-09-30 17:46:59.410954824 +0000 UTC m=+269.933454072" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.427855 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d48nz" podStartSLOduration=11.188348967 podStartE2EDuration="1m54.427830401s" podCreationTimestamp="2025-09-30 17:45:05 +0000 UTC" firstStartedPulling="2025-09-30 17:45:07.324468323 +0000 UTC m=+157.846967571" lastFinishedPulling="2025-09-30 17:46:50.563949757 +0000 UTC m=+261.086449005" observedRunningTime="2025-09-30 17:46:59.422536501 +0000 UTC m=+269.945035749" watchObservedRunningTime="2025-09-30 17:46:59.427830401 +0000 UTC m=+269.950329679" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.441574 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lw6hl" podStartSLOduration=6.184549462 podStartE2EDuration="1m50.441554965s" podCreationTimestamp="2025-09-30 17:45:09 +0000 UTC" firstStartedPulling="2025-09-30 17:45:11.506786922 +0000 UTC m=+162.029286160" lastFinishedPulling="2025-09-30 17:46:55.763792425 +0000 UTC m=+266.286291663" observedRunningTime="2025-09-30 17:46:59.440385863 +0000 UTC m=+269.962885121" watchObservedRunningTime="2025-09-30 17:46:59.441554965 +0000 UTC m=+269.964054223" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.642680 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:46:59 crc kubenswrapper[4797]: I0930 17:46:59.642756 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:47:00 crc kubenswrapper[4797]: I0930 17:47:00.697896 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lw6hl" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="registry-server" probeResult="failure" output=< Sep 30 17:47:00 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 17:47:00 crc kubenswrapper[4797]: > Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.051532 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.055526 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.107418 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.248194 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.248255 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.297012 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.464344 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:47:06 
crc kubenswrapper[4797]: I0930 17:47:06.464881 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.655006 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.655467 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:06 crc kubenswrapper[4797]: I0930 17:47:06.694740 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:07 crc kubenswrapper[4797]: I0930 17:47:07.481459 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:08 crc kubenswrapper[4797]: I0930 17:47:08.252547 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:47:08 crc kubenswrapper[4797]: I0930 17:47:08.252611 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:47:08 crc kubenswrapper[4797]: I0930 17:47:08.283154 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:47:08 crc kubenswrapper[4797]: I0930 17:47:08.482368 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:47:08 crc kubenswrapper[4797]: I0930 17:47:08.715408 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:47:09 crc kubenswrapper[4797]: I0930 17:47:09.601207 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:47:09 crc kubenswrapper[4797]: I0930 17:47:09.601456 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqgjz" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="registry-server" containerID="cri-o://d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212" gracePeriod=2 Sep 30 17:47:09 crc kubenswrapper[4797]: I0930 17:47:09.685920 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:47:09 crc kubenswrapper[4797]: I0930 17:47:09.725110 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.045147 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.233653 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content\") pod \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.233797 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities\") pod \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.233834 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8rlw\" (UniqueName: \"kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw\") pod 
\"24b1d14d-8ba7-4862-ae38-d9dc28061c12\" (UID: \"24b1d14d-8ba7-4862-ae38-d9dc28061c12\") " Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.234662 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities" (OuterVolumeSpecName: "utilities") pod "24b1d14d-8ba7-4862-ae38-d9dc28061c12" (UID: "24b1d14d-8ba7-4862-ae38-d9dc28061c12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.242595 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw" (OuterVolumeSpecName: "kube-api-access-v8rlw") pod "24b1d14d-8ba7-4862-ae38-d9dc28061c12" (UID: "24b1d14d-8ba7-4862-ae38-d9dc28061c12"). InnerVolumeSpecName "kube-api-access-v8rlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.280987 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24b1d14d-8ba7-4862-ae38-d9dc28061c12" (UID: "24b1d14d-8ba7-4862-ae38-d9dc28061c12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.335100 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.335141 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8rlw\" (UniqueName: \"kubernetes.io/projected/24b1d14d-8ba7-4862-ae38-d9dc28061c12-kube-api-access-v8rlw\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.335152 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b1d14d-8ba7-4862-ae38-d9dc28061c12-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.448242 4797 generic.go:334] "Generic (PLEG): container finished" podID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerID="d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212" exitCode=0 Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.448295 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgjz" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.448337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerDied","Data":"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212"} Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.448798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgjz" event={"ID":"24b1d14d-8ba7-4862-ae38-d9dc28061c12","Type":"ContainerDied","Data":"435f0ab0b575b0dac2ea3d3e52e2b9c148cc4c3b992112ad17ada3ca015c67f7"} Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.448829 4797 scope.go:117] "RemoveContainer" containerID="d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.467290 4797 scope.go:117] "RemoveContainer" containerID="d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.490036 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.495038 4797 scope.go:117] "RemoveContainer" containerID="f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.496128 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqgjz"] Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.511491 4797 scope.go:117] "RemoveContainer" containerID="d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212" Sep 30 17:47:10 crc kubenswrapper[4797]: E0930 17:47:10.512032 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212\": container with ID starting with d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212 not found: ID does not exist" containerID="d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.512076 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212"} err="failed to get container status \"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212\": rpc error: code = NotFound desc = could not find container \"d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212\": container with ID starting with d4a25ba9cfeb9530932c5b5e063b18297990defbc987db296ba93fe6a8243212 not found: ID does not exist" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.512112 4797 scope.go:117] "RemoveContainer" containerID="d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003" Sep 30 17:47:10 crc kubenswrapper[4797]: E0930 17:47:10.512457 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003\": container with ID starting with d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003 not found: ID does not exist" containerID="d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.512500 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003"} err="failed to get container status \"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003\": rpc error: code = NotFound desc = could not find container \"d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003\": container with ID 
starting with d0a1f4de9609ea4a76adc9aa9df075900d8ec1c9318d3c700ff91ecefce6a003 not found: ID does not exist" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.512513 4797 scope.go:117] "RemoveContainer" containerID="f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961" Sep 30 17:47:10 crc kubenswrapper[4797]: E0930 17:47:10.512837 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961\": container with ID starting with f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961 not found: ID does not exist" containerID="f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961" Sep 30 17:47:10 crc kubenswrapper[4797]: I0930 17:47:10.512973 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961"} err="failed to get container status \"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961\": rpc error: code = NotFound desc = could not find container \"f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961\": container with ID starting with f9b771120296405b1741ad83a9cc8487a570b2e333a7549505149fc916b18961 not found: ID does not exist" Sep 30 17:47:11 crc kubenswrapper[4797]: I0930 17:47:11.810601 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:47:11 crc kubenswrapper[4797]: I0930 17:47:11.811105 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88s5q" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="registry-server" containerID="cri-o://d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a" gracePeriod=2 Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.246416 4797 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" path="/var/lib/kubelet/pods/24b1d14d-8ba7-4862-ae38-d9dc28061c12/volumes" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.295408 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463086 4797 generic.go:334] "Generic (PLEG): container finished" podID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerID="d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a" exitCode=0 Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463179 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s5q" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463203 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerDied","Data":"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a"} Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463514 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s5q" event={"ID":"2c10f39f-1403-4083-abff-6dc18d4812d9","Type":"ContainerDied","Data":"cd660790485fa73c9f2a420c80008e929645c75944c4ee91362a6367016cecbf"} Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463560 4797 scope.go:117] "RemoveContainer" containerID="d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463691 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content\") pod \"2c10f39f-1403-4083-abff-6dc18d4812d9\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " Sep 30 17:47:12 crc 
kubenswrapper[4797]: I0930 17:47:12.463842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh6nk\" (UniqueName: \"kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk\") pod \"2c10f39f-1403-4083-abff-6dc18d4812d9\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.463859 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities\") pod \"2c10f39f-1403-4083-abff-6dc18d4812d9\" (UID: \"2c10f39f-1403-4083-abff-6dc18d4812d9\") " Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.464683 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities" (OuterVolumeSpecName: "utilities") pod "2c10f39f-1403-4083-abff-6dc18d4812d9" (UID: "2c10f39f-1403-4083-abff-6dc18d4812d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.475315 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c10f39f-1403-4083-abff-6dc18d4812d9" (UID: "2c10f39f-1403-4083-abff-6dc18d4812d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.475793 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk" (OuterVolumeSpecName: "kube-api-access-xh6nk") pod "2c10f39f-1403-4083-abff-6dc18d4812d9" (UID: "2c10f39f-1403-4083-abff-6dc18d4812d9"). InnerVolumeSpecName "kube-api-access-xh6nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.513615 4797 scope.go:117] "RemoveContainer" containerID="f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.532000 4797 scope.go:117] "RemoveContainer" containerID="12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.549653 4797 scope.go:117] "RemoveContainer" containerID="d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a" Sep 30 17:47:12 crc kubenswrapper[4797]: E0930 17:47:12.550240 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a\": container with ID starting with d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a not found: ID does not exist" containerID="d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.550276 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a"} err="failed to get container status \"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a\": rpc error: code = NotFound desc = could not find container \"d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a\": container with ID starting with d97e9787a2356f5704b0a41e91163215b19b0b8ea4d7ce5320cdcef742680d2a not found: ID does not exist" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.550301 4797 scope.go:117] "RemoveContainer" containerID="f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663" Sep 30 17:47:12 crc kubenswrapper[4797]: E0930 17:47:12.550662 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663\": container with ID starting with f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663 not found: ID does not exist" containerID="f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.550729 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663"} err="failed to get container status \"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663\": rpc error: code = NotFound desc = could not find container \"f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663\": container with ID starting with f603000cc3254c097bca6225ed20e22e353fa348b7a406040e6e3ecb6aac4663 not found: ID does not exist" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.550773 4797 scope.go:117] "RemoveContainer" containerID="12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e" Sep 30 17:47:12 crc kubenswrapper[4797]: E0930 17:47:12.551069 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e\": container with ID starting with 12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e not found: ID does not exist" containerID="12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.551090 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e"} err="failed to get container status \"12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e\": rpc error: code = NotFound desc = could not find container \"12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e\": 
container with ID starting with 12b2658fa9befd6ca48b63309fcc8ac8f79d796e52c098605ac1ea68aafafa5e not found: ID does not exist" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.564901 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh6nk\" (UniqueName: \"kubernetes.io/projected/2c10f39f-1403-4083-abff-6dc18d4812d9-kube-api-access-xh6nk\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.564927 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.564942 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c10f39f-1403-4083-abff-6dc18d4812d9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.816305 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:47:12 crc kubenswrapper[4797]: I0930 17:47:12.818744 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s5q"] Sep 30 17:47:14 crc kubenswrapper[4797]: I0930 17:47:14.198788 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:47:14 crc kubenswrapper[4797]: I0930 17:47:14.199433 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lw6hl" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="registry-server" containerID="cri-o://405f776972b277d7f5179817364ca14db80f1cad42ce4f2193c9d858a9830458" gracePeriod=2 Sep 30 17:47:14 crc kubenswrapper[4797]: I0930 17:47:14.248064 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" 
path="/var/lib/kubelet/pods/2c10f39f-1403-4083-abff-6dc18d4812d9/volumes" Sep 30 17:47:14 crc kubenswrapper[4797]: I0930 17:47:14.476840 4797 generic.go:334] "Generic (PLEG): container finished" podID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerID="405f776972b277d7f5179817364ca14db80f1cad42ce4f2193c9d858a9830458" exitCode=0 Sep 30 17:47:14 crc kubenswrapper[4797]: I0930 17:47:14.476895 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerDied","Data":"405f776972b277d7f5179817364ca14db80f1cad42ce4f2193c9d858a9830458"} Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.127215 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.297671 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content\") pod \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.299663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl2d2\" (UniqueName: \"kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2\") pod \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.299753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities\") pod \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\" (UID: \"24a63ddc-c622-4c51-80d5-933b49fd1bc7\") " Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.300954 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities" (OuterVolumeSpecName: "utilities") pod "24a63ddc-c622-4c51-80d5-933b49fd1bc7" (UID: "24a63ddc-c622-4c51-80d5-933b49fd1bc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.305346 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2" (OuterVolumeSpecName: "kube-api-access-cl2d2") pod "24a63ddc-c622-4c51-80d5-933b49fd1bc7" (UID: "24a63ddc-c622-4c51-80d5-933b49fd1bc7"). InnerVolumeSpecName "kube-api-access-cl2d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.373454 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24a63ddc-c622-4c51-80d5-933b49fd1bc7" (UID: "24a63ddc-c622-4c51-80d5-933b49fd1bc7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.401200 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.401239 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24a63ddc-c622-4c51-80d5-933b49fd1bc7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.401256 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl2d2\" (UniqueName: \"kubernetes.io/projected/24a63ddc-c622-4c51-80d5-933b49fd1bc7-kube-api-access-cl2d2\") on node \"crc\" DevicePath \"\"" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.482602 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw6hl" event={"ID":"24a63ddc-c622-4c51-80d5-933b49fd1bc7","Type":"ContainerDied","Data":"2d4142d7b9158c2850b21ccc81d587e5d0296fafefaf26fc75e7209dd7769e2d"} Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.482665 4797 scope.go:117] "RemoveContainer" containerID="405f776972b277d7f5179817364ca14db80f1cad42ce4f2193c9d858a9830458" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.482691 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lw6hl" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.499013 4797 scope.go:117] "RemoveContainer" containerID="29bb4446cecb0bfbb964a959d9eac60e56133be77012d4a83e1c830eaaf8b125" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.518545 4797 scope.go:117] "RemoveContainer" containerID="92ab7545c03be8ed375fa88b66299f5970f9553620b729684418ffffca7b90ef" Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.544226 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:47:15 crc kubenswrapper[4797]: I0930 17:47:15.555085 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lw6hl"] Sep 30 17:47:16 crc kubenswrapper[4797]: I0930 17:47:16.244466 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" path="/var/lib/kubelet/pods/24a63ddc-c622-4c51-80d5-933b49fd1bc7/volumes" Sep 30 17:47:47 crc kubenswrapper[4797]: I0930 17:47:47.356078 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:48:12 crc kubenswrapper[4797]: I0930 17:48:12.388931 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" containerID="cri-o://21b43fad5edf41571d16f910352ea52e7a7202fa91b343d4b3bf2bc95d175bb8" gracePeriod=15 Sep 30 17:48:12 crc kubenswrapper[4797]: I0930 17:48:12.820661 4797 generic.go:334] "Generic (PLEG): container finished" podID="58496a63-6105-4e38-b1b0-f91a7276121e" containerID="21b43fad5edf41571d16f910352ea52e7a7202fa91b343d4b3bf2bc95d175bb8" exitCode=0 Sep 30 17:48:12 crc kubenswrapper[4797]: I0930 17:48:12.820825 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" event={"ID":"58496a63-6105-4e38-b1b0-f91a7276121e","Type":"ContainerDied","Data":"21b43fad5edf41571d16f910352ea52e7a7202fa91b343d4b3bf2bc95d175bb8"} Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.600852 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641148 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj"] Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641319 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641329 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641338 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641344 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641352 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641358 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641365 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="extract-content" 
Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641370 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641541 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" containerName="pruner" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641549 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" containerName="pruner" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641741 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641748 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641757 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641763 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641770 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641776 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641785 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="extract-utilities" Sep 30 17:48:13 crc 
kubenswrapper[4797]: I0930 17:48:13.641790 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="extract-utilities" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641816 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641824 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641833 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641839 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641846 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641851 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="extract-content" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641860 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.641865 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: E0930 17:48:13.641873 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="registry-server" Sep 30 17:48:13 crc 
kubenswrapper[4797]: I0930 17:48:13.641897 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642000 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="abfc63a5-f217-4b84-925a-91c4e900aaa8" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642014 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b1d14d-8ba7-4862-ae38-d9dc28061c12" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642022 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a63ddc-c622-4c51-80d5-933b49fd1bc7" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642028 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" containerName="oauth-openshift" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642060 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa29841-4bf9-42cf-be99-fdb1fa4a8f1c" containerName="pruner" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642069 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c10f39f-1403-4083-abff-6dc18d4812d9" containerName="registry-server" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.642506 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.658694 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj"] Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.703840 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.703906 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.703948 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704096 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704543 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6f8\" (UniqueName: 
\"kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704632 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704665 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704696 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704718 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704749 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704774 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704822 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704847 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704878 
4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.704903 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle\") pod \"58496a63-6105-4e38-b1b0-f91a7276121e\" (UID: \"58496a63-6105-4e38-b1b0-f91a7276121e\") " Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705115 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705157 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " 
pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705253 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wjn\" (UniqueName: \"kubernetes.io/projected/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-kube-api-access-z6wjn\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705284 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705327 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705365 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705390 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705501 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705531 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705558 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705630 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705671 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.705736 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.707260 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.708127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.708517 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.710416 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8" (OuterVolumeSpecName: "kube-api-access-qt6f8") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "kube-api-access-qt6f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.718529 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.718805 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.719173 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.719558 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.720350 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.720831 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.721109 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.723694 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "58496a63-6105-4e38-b1b0-f91a7276121e" (UID: "58496a63-6105-4e38-b1b0-f91a7276121e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.806485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.806656 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.808369 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-policies\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.808548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.808655 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.808700 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.808749 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.809544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.809601 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " 
pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.809703 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.809770 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.809966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wjn\" (UniqueName: 
\"kubernetes.io/projected/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-kube-api-access-z6wjn\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810249 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-audit-dir\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810347 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810382 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810395 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810413 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6f8\" (UniqueName: \"kubernetes.io/projected/58496a63-6105-4e38-b1b0-f91a7276121e-kube-api-access-qt6f8\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810538 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810604 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810571 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.810999 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811034 4797 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811062 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811091 4797 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58496a63-6105-4e38-b1b0-f91a7276121e-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811118 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811147 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811177 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811205 4797 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58496a63-6105-4e38-b1b0-f91a7276121e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.811788 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.812766 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-login\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.812965 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.814587 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.815941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: 
\"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.816121 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.817167 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-session\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.817201 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-user-template-error\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.818970 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.831205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" event={"ID":"58496a63-6105-4e38-b1b0-f91a7276121e","Type":"ContainerDied","Data":"0e732097a43e1d93ccbd71d6b62b602854262d959f2de5927dec61cce7ca8aac"} Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.831243 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kpfg6" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.831263 4797 scope.go:117] "RemoveContainer" containerID="21b43fad5edf41571d16f910352ea52e7a7202fa91b343d4b3bf2bc95d175bb8" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.835107 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wjn\" (UniqueName: \"kubernetes.io/projected/7fcdcf09-b21b-44af-ab78-23d1bd8a5c45-kube-api-access-z6wjn\") pod \"oauth-openshift-9fbfc7dc4-d89rj\" (UID: \"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45\") " pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.861325 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.868677 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kpfg6"] Sep 30 17:48:13 crc kubenswrapper[4797]: I0930 17:48:13.972130 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.228190 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj"] Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.244204 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58496a63-6105-4e38-b1b0-f91a7276121e" path="/var/lib/kubelet/pods/58496a63-6105-4e38-b1b0-f91a7276121e/volumes" Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.840710 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" event={"ID":"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45","Type":"ContainerStarted","Data":"9630c7f1fa06708e0a72e23f248760650f41e4db1f5bfb0b47a557a855714ef9"} Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.841161 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" event={"ID":"7fcdcf09-b21b-44af-ab78-23d1bd8a5c45","Type":"ContainerStarted","Data":"f1ab7d716b63a967743bede77d3c187c7f0a2a99f342adf1d65f607ab63b1aca"} Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.841204 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.849847 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" Sep 30 17:48:14 crc kubenswrapper[4797]: I0930 17:48:14.872187 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9fbfc7dc4-d89rj" podStartSLOduration=27.87215798 podStartE2EDuration="27.87215798s" podCreationTimestamp="2025-09-30 17:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 17:48:14.870775694 +0000 UTC m=+345.393275022" watchObservedRunningTime="2025-09-30 17:48:14.87215798 +0000 UTC m=+345.394657258" Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.887235 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.889292 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p67c6" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="registry-server" containerID="cri-o://60b3fae6ece164d290173dc761d1d2bf7838910ba7c04214abe07dc2f34c3e6e" gracePeriod=30 Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.902776 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.903006 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d48nz" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="registry-server" containerID="cri-o://bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" gracePeriod=30 Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.919736 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.920364 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" containerID="cri-o://6bfac2f32579b96057c8a528b7e7756b3cd278dc471dc9b6f16aea0ce1bde4b2" gracePeriod=30 Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.925762 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.926000 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wkwqb" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="registry-server" containerID="cri-o://4085a0433342367ad403904e2639cbf71e8da102f725fd08cab0a7f6d376afd3" gracePeriod=30 Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.939133 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.939344 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7jv9j" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="registry-server" containerID="cri-o://490f7313d5677d7e14996502158e08c1f7576c26d69f5df9d020be3d8a6b9565" gracePeriod=30 Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.949993 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qh78"] Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.950582 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:23 crc kubenswrapper[4797]: I0930 17:48:23.968738 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qh78"] Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.056284 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdgf\" (UniqueName: \"kubernetes.io/projected/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-kube-api-access-gvdgf\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.056372 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.056420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.157191 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdgf\" (UniqueName: \"kubernetes.io/projected/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-kube-api-access-gvdgf\") pod \"marketplace-operator-79b997595-7qh78\" (UID: 
\"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.157556 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.159788 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.162254 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.165129 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.177975 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvdgf\" (UniqueName: \"kubernetes.io/projected/e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2-kube-api-access-gvdgf\") pod \"marketplace-operator-79b997595-7qh78\" (UID: \"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2\") " pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.281729 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.475004 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7qh78"] Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.925503 4797 generic.go:334] "Generic (PLEG): container finished" podID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerID="6bfac2f32579b96057c8a528b7e7756b3cd278dc471dc9b6f16aea0ce1bde4b2" exitCode=0 Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.925719 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" event={"ID":"03a9880c-d077-47c8-b93a-d96cf7dced9c","Type":"ContainerDied","Data":"6bfac2f32579b96057c8a528b7e7756b3cd278dc471dc9b6f16aea0ce1bde4b2"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.929481 4797 generic.go:334] "Generic (PLEG): container finished" podID="623778df-ea9e-4898-8286-f200b9a29844" containerID="60b3fae6ece164d290173dc761d1d2bf7838910ba7c04214abe07dc2f34c3e6e" exitCode=0 Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.929578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerDied","Data":"60b3fae6ece164d290173dc761d1d2bf7838910ba7c04214abe07dc2f34c3e6e"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.933955 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="1c653129-5d28-4cee-8df0-27f782045d84" containerID="4085a0433342367ad403904e2639cbf71e8da102f725fd08cab0a7f6d376afd3" exitCode=0 Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.934027 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerDied","Data":"4085a0433342367ad403904e2639cbf71e8da102f725fd08cab0a7f6d376afd3"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.938327 4797 generic.go:334] "Generic (PLEG): container finished" podID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerID="bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" exitCode=0 Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.938397 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerDied","Data":"bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.941042 4797 generic.go:334] "Generic (PLEG): container finished" podID="366f3f10-83ff-410c-add7-935d4a0811ba" containerID="490f7313d5677d7e14996502158e08c1f7576c26d69f5df9d020be3d8a6b9565" exitCode=0 Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.941111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerDied","Data":"490f7313d5677d7e14996502158e08c1f7576c26d69f5df9d020be3d8a6b9565"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.942989 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" event={"ID":"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2","Type":"ContainerStarted","Data":"0db6cfcd5991f885b611d06770ce3ba3ca814c5ec585c9b87beac1e07718c05c"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 
17:48:24.943024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" event={"ID":"e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2","Type":"ContainerStarted","Data":"3725c17f6bc92e7122aa5f5873b3e202ce3c558f81097333e48d55f23bd1ec5d"} Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.944828 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.946751 4797 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7qh78 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" start-of-body= Sep 30 17:48:24 crc kubenswrapper[4797]: I0930 17:48:24.946812 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" podUID="e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.024986 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.046240 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" podStartSLOduration=2.046221948 podStartE2EDuration="2.046221948s" podCreationTimestamp="2025-09-30 17:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:48:24.979816779 +0000 UTC m=+355.502316107" watchObservedRunningTime="2025-09-30 17:48:25.046221948 +0000 UTC m=+355.568721186" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.185733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content\") pod \"366f3f10-83ff-410c-add7-935d4a0811ba\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.185859 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities\") pod \"366f3f10-83ff-410c-add7-935d4a0811ba\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.186055 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgrhj\" (UniqueName: \"kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj\") pod \"366f3f10-83ff-410c-add7-935d4a0811ba\" (UID: \"366f3f10-83ff-410c-add7-935d4a0811ba\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.186577 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities" (OuterVolumeSpecName: "utilities") pod 
"366f3f10-83ff-410c-add7-935d4a0811ba" (UID: "366f3f10-83ff-410c-add7-935d4a0811ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.191012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj" (OuterVolumeSpecName: "kube-api-access-hgrhj") pod "366f3f10-83ff-410c-add7-935d4a0811ba" (UID: "366f3f10-83ff-410c-add7-935d4a0811ba"). InnerVolumeSpecName "kube-api-access-hgrhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.276370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "366f3f10-83ff-410c-add7-935d4a0811ba" (UID: "366f3f10-83ff-410c-add7-935d4a0811ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.288192 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgrhj\" (UniqueName: \"kubernetes.io/projected/366f3f10-83ff-410c-add7-935d4a0811ba-kube-api-access-hgrhj\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.288236 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.288247 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366f3f10-83ff-410c-add7-935d4a0811ba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.765602 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.896897 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics\") pod \"03a9880c-d077-47c8-b93a-d96cf7dced9c\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.896977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q88sg\" (UniqueName: \"kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg\") pod \"03a9880c-d077-47c8-b93a-d96cf7dced9c\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.897055 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca\") pod \"03a9880c-d077-47c8-b93a-d96cf7dced9c\" (UID: \"03a9880c-d077-47c8-b93a-d96cf7dced9c\") " Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.897729 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "03a9880c-d077-47c8-b93a-d96cf7dced9c" (UID: "03a9880c-d077-47c8-b93a-d96cf7dced9c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.910738 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "03a9880c-d077-47c8-b93a-d96cf7dced9c" (UID: "03a9880c-d077-47c8-b93a-d96cf7dced9c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.912400 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg" (OuterVolumeSpecName: "kube-api-access-q88sg") pod "03a9880c-d077-47c8-b93a-d96cf7dced9c" (UID: "03a9880c-d077-47c8-b93a-d96cf7dced9c"). InnerVolumeSpecName "kube-api-access-q88sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.934486 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.962597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" event={"ID":"03a9880c-d077-47c8-b93a-d96cf7dced9c","Type":"ContainerDied","Data":"e3a5ea64eba9f2bdda7e35696c65717d62c0308d3fbf420cad387c7b03ae6362"} Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.962644 4797 scope.go:117] "RemoveContainer" containerID="6bfac2f32579b96057c8a528b7e7756b3cd278dc471dc9b6f16aea0ce1bde4b2" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.962748 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhrrh" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.970183 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67c6" event={"ID":"623778df-ea9e-4898-8286-f200b9a29844","Type":"ContainerDied","Data":"5bc89ab0a7e78359ffcc31cd8f0dd75875f3bde0265b9c67c8cb2668e17d3308"} Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.970343 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p67c6" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.981286 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jv9j" event={"ID":"366f3f10-83ff-410c-add7-935d4a0811ba","Type":"ContainerDied","Data":"c4b8a1a46465f160ddf27ad9ae0c642e8b6672bf42833c9be9210f16b0550a30"} Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.981381 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7jv9j" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.988505 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7qh78" Sep 30 17:48:25 crc kubenswrapper[4797]: I0930 17:48:25.988526 4797 scope.go:117] "RemoveContainer" containerID="60b3fae6ece164d290173dc761d1d2bf7838910ba7c04214abe07dc2f34c3e6e" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.002316 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.002349 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q88sg\" (UniqueName: \"kubernetes.io/projected/03a9880c-d077-47c8-b93a-d96cf7dced9c-kube-api-access-q88sg\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.002364 4797 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03a9880c-d077-47c8-b93a-d96cf7dced9c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.011237 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"] Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.014675 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhrrh"] Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.033182 4797 scope.go:117] "RemoveContainer" containerID="da9991aa57e907e01f0148a64158930c6e0d5950e3133a21d1e96071372a488e" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.044960 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.049513 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7jv9j"] Sep 30 17:48:26 crc kubenswrapper[4797]: E0930 17:48:26.051872 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00 is running failed: container process not found" containerID="bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:48:26 crc kubenswrapper[4797]: E0930 17:48:26.052560 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00 is running failed: container process not found" containerID="bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:48:26 crc kubenswrapper[4797]: E0930 17:48:26.053638 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00 is running failed: container process not found" containerID="bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:48:26 crc kubenswrapper[4797]: E0930 17:48:26.053697 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-d48nz" 
podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="registry-server" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.063629 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.076670 4797 scope.go:117] "RemoveContainer" containerID="f811ea69af862d2dc0bba1895c62d8b4bead7250a14810833c23f0bdbe020012" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.088374 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.098233 4797 scope.go:117] "RemoveContainer" containerID="490f7313d5677d7e14996502158e08c1f7576c26d69f5df9d020be3d8a6b9565" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109506 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content\") pod \"623778df-ea9e-4898-8286-f200b9a29844\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109595 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities\") pod \"623778df-ea9e-4898-8286-f200b9a29844\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109765 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities\") pod \"1c653129-5d28-4cee-8df0-27f782045d84\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109796 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gfbmh\" (UniqueName: \"kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh\") pod \"1c653129-5d28-4cee-8df0-27f782045d84\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109817 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzttz\" (UniqueName: \"kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz\") pod \"623778df-ea9e-4898-8286-f200b9a29844\" (UID: \"623778df-ea9e-4898-8286-f200b9a29844\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.109833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content\") pod \"1c653129-5d28-4cee-8df0-27f782045d84\" (UID: \"1c653129-5d28-4cee-8df0-27f782045d84\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.110526 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities" (OuterVolumeSpecName: "utilities") pod "623778df-ea9e-4898-8286-f200b9a29844" (UID: "623778df-ea9e-4898-8286-f200b9a29844"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.113130 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities" (OuterVolumeSpecName: "utilities") pod "1c653129-5d28-4cee-8df0-27f782045d84" (UID: "1c653129-5d28-4cee-8df0-27f782045d84"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.113485 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh" (OuterVolumeSpecName: "kube-api-access-gfbmh") pod "1c653129-5d28-4cee-8df0-27f782045d84" (UID: "1c653129-5d28-4cee-8df0-27f782045d84"). InnerVolumeSpecName "kube-api-access-gfbmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.114619 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz" (OuterVolumeSpecName: "kube-api-access-bzttz") pod "623778df-ea9e-4898-8286-f200b9a29844" (UID: "623778df-ea9e-4898-8286-f200b9a29844"). InnerVolumeSpecName "kube-api-access-bzttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.124037 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c653129-5d28-4cee-8df0-27f782045d84" (UID: "1c653129-5d28-4cee-8df0-27f782045d84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.125734 4797 scope.go:117] "RemoveContainer" containerID="f8acbbb3ef1ff95cce236c66be04970186d00f0697d70c3890cf96dc2e7722be" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.144653 4797 scope.go:117] "RemoveContainer" containerID="c71315081bb68442075f81844b6ca20417b0eef0eccfdf444de086313e3fc19c" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.159506 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "623778df-ea9e-4898-8286-f200b9a29844" (UID: "623778df-ea9e-4898-8286-f200b9a29844"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.210919 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content\") pod \"96db7793-b3a4-46c5-889e-553d7d41ed0f\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.210982 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities\") pod \"96db7793-b3a4-46c5-889e-553d7d41ed0f\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211077 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9r2t\" (UniqueName: \"kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t\") pod \"96db7793-b3a4-46c5-889e-553d7d41ed0f\" (UID: \"96db7793-b3a4-46c5-889e-553d7d41ed0f\") " Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211282 4797 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211301 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623778df-ea9e-4898-8286-f200b9a29844-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211312 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211322 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfbmh\" (UniqueName: \"kubernetes.io/projected/1c653129-5d28-4cee-8df0-27f782045d84-kube-api-access-gfbmh\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211332 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzttz\" (UniqueName: \"kubernetes.io/projected/623778df-ea9e-4898-8286-f200b9a29844-kube-api-access-bzttz\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211342 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c653129-5d28-4cee-8df0-27f782045d84-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.211689 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities" (OuterVolumeSpecName: "utilities") pod "96db7793-b3a4-46c5-889e-553d7d41ed0f" (UID: "96db7793-b3a4-46c5-889e-553d7d41ed0f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.213563 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t" (OuterVolumeSpecName: "kube-api-access-z9r2t") pod "96db7793-b3a4-46c5-889e-553d7d41ed0f" (UID: "96db7793-b3a4-46c5-889e-553d7d41ed0f"). InnerVolumeSpecName "kube-api-access-z9r2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.244105 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" path="/var/lib/kubelet/pods/03a9880c-d077-47c8-b93a-d96cf7dced9c/volumes" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.244722 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" path="/var/lib/kubelet/pods/366f3f10-83ff-410c-add7-935d4a0811ba/volumes" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.263467 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96db7793-b3a4-46c5-889e-553d7d41ed0f" (UID: "96db7793-b3a4-46c5-889e-553d7d41ed0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.293381 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.299814 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p67c6"] Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.312798 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9r2t\" (UniqueName: \"kubernetes.io/projected/96db7793-b3a4-46c5-889e-553d7d41ed0f-kube-api-access-z9r2t\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.312849 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.312860 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96db7793-b3a4-46c5-889e-553d7d41ed0f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.992932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wkwqb" event={"ID":"1c653129-5d28-4cee-8df0-27f782045d84","Type":"ContainerDied","Data":"2df110bfd7313776475101cbfa3beb27dc70025ac703c20141387840a11d9859"} Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.993001 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wkwqb" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.993025 4797 scope.go:117] "RemoveContainer" containerID="4085a0433342367ad403904e2639cbf71e8da102f725fd08cab0a7f6d376afd3" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.999211 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d48nz" Sep 30 17:48:26 crc kubenswrapper[4797]: I0930 17:48:26.999520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d48nz" event={"ID":"96db7793-b3a4-46c5-889e-553d7d41ed0f","Type":"ContainerDied","Data":"23f56a354ca325ec1e5e3983a5a4c242b7d82a57b12d74a2b5a4742f3d6beebb"} Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.028210 4797 scope.go:117] "RemoveContainer" containerID="d061be0790f17f465a7726b7f332589562b1311be1c72a42b3cdc4bc68f427a3" Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.030375 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.034957 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wkwqb"] Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.086631 4797 scope.go:117] "RemoveContainer" containerID="6630b02104ef2bcfccaf536da8e9ac2d079e35a251babf3317fba7f2c3f2d3b2" Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.120288 4797 scope.go:117] "RemoveContainer" containerID="bd0723b1056235dda0859adedd957805f860faf28397a7ac3544f2125df5fc00" Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.127619 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.132508 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-d48nz"] Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.159282 4797 scope.go:117] "RemoveContainer" containerID="6edfb3491bcf273be5766548a74e1f7b58b1f5c0cd1dba122a125d89788c195e" Sep 30 17:48:27 crc kubenswrapper[4797]: I0930 17:48:27.182872 4797 scope.go:117] "RemoveContainer" containerID="7415f2de609de0360f364043822f1d55e9b693faf606ce1d3403ac9810949b56" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.122585 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8gwld"] Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123397 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123431 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123490 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123508 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123527 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123544 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123568 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="registry-server" Sep 30 
17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123583 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123606 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123620 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123647 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123662 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123694 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123711 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123735 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123750 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123776 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="extract-utilities" Sep 30 17:48:28 crc 
kubenswrapper[4797]: I0930 17:48:28.123795 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123816 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123833 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123863 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123879 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123900 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123916 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="extract-content" Sep 30 17:48:28 crc kubenswrapper[4797]: E0930 17:48:28.123943 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.123960 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="extract-utilities" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.124186 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c653129-5d28-4cee-8df0-27f782045d84" containerName="registry-server" Sep 30 17:48:28 crc 
kubenswrapper[4797]: I0930 17:48:28.124215 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="623778df-ea9e-4898-8286-f200b9a29844" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.124234 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.124257 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a9880c-d077-47c8-b93a-d96cf7dced9c" containerName="marketplace-operator" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.124274 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="366f3f10-83ff-410c-add7-935d4a0811ba" containerName="registry-server" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.125986 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.128229 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gwld"] Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.130428 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.144240 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-utilities\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.144519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfr6f\" (UniqueName: 
\"kubernetes.io/projected/42c7f1c4-357b-4a48-b0b0-71088e564851-kube-api-access-zfr6f\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.144649 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-catalog-content\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.245717 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-utilities\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.245822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfr6f\" (UniqueName: \"kubernetes.io/projected/42c7f1c4-357b-4a48-b0b0-71088e564851-kube-api-access-zfr6f\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.245975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-catalog-content\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.246311 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-utilities\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.246630 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7f1c4-357b-4a48-b0b0-71088e564851-catalog-content\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.266352 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c653129-5d28-4cee-8df0-27f782045d84" path="/var/lib/kubelet/pods/1c653129-5d28-4cee-8df0-27f782045d84/volumes" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.268415 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="623778df-ea9e-4898-8286-f200b9a29844" path="/var/lib/kubelet/pods/623778df-ea9e-4898-8286-f200b9a29844/volumes" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.270622 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfr6f\" (UniqueName: \"kubernetes.io/projected/42c7f1c4-357b-4a48-b0b0-71088e564851-kube-api-access-zfr6f\") pod \"certified-operators-8gwld\" (UID: \"42c7f1c4-357b-4a48-b0b0-71088e564851\") " pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.285788 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96db7793-b3a4-46c5-889e-553d7d41ed0f" path="/var/lib/kubelet/pods/96db7793-b3a4-46c5-889e-553d7d41ed0f/volumes" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.311833 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8pxk9"] Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.314163 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.316310 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.322670 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8pxk9"] Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.346430 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-catalog-content\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.346661 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psskd\" (UniqueName: \"kubernetes.io/projected/b8e54563-9315-4e3e-9527-6d2849b83ee3-kube-api-access-psskd\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.346803 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-utilities\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.447766 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-catalog-content\") 
pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.447854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psskd\" (UniqueName: \"kubernetes.io/projected/b8e54563-9315-4e3e-9527-6d2849b83ee3-kube-api-access-psskd\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.447961 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-utilities\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.448565 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-catalog-content\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.448740 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e54563-9315-4e3e-9527-6d2849b83ee3-utilities\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.463062 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.477404 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psskd\" (UniqueName: \"kubernetes.io/projected/b8e54563-9315-4e3e-9527-6d2849b83ee3-kube-api-access-psskd\") pod \"community-operators-8pxk9\" (UID: \"b8e54563-9315-4e3e-9527-6d2849b83ee3\") " pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.646728 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.686039 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gwld"] Sep 30 17:48:28 crc kubenswrapper[4797]: W0930 17:48:28.692703 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c7f1c4_357b_4a48_b0b0_71088e564851.slice/crio-e6f50931a00671deb55b4dc1581540e529092e2e544778bbc4afded47349fadc WatchSource:0}: Error finding container e6f50931a00671deb55b4dc1581540e529092e2e544778bbc4afded47349fadc: Status 404 returned error can't find the container with id e6f50931a00671deb55b4dc1581540e529092e2e544778bbc4afded47349fadc Sep 30 17:48:28 crc kubenswrapper[4797]: I0930 17:48:28.901296 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8pxk9"] Sep 30 17:48:28 crc kubenswrapper[4797]: W0930 17:48:28.915409 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e54563_9315_4e3e_9527_6d2849b83ee3.slice/crio-3ca7a1ecd2a12c9cc6fa05f1abb384c0f2bbf363a8fabd2fc4678edc68d62eb1 WatchSource:0}: Error finding container 3ca7a1ecd2a12c9cc6fa05f1abb384c0f2bbf363a8fabd2fc4678edc68d62eb1: Status 404 returned error 
can't find the container with id 3ca7a1ecd2a12c9cc6fa05f1abb384c0f2bbf363a8fabd2fc4678edc68d62eb1 Sep 30 17:48:29 crc kubenswrapper[4797]: I0930 17:48:29.021354 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerStarted","Data":"e6f50931a00671deb55b4dc1581540e529092e2e544778bbc4afded47349fadc"} Sep 30 17:48:29 crc kubenswrapper[4797]: I0930 17:48:29.022739 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxk9" event={"ID":"b8e54563-9315-4e3e-9527-6d2849b83ee3","Type":"ContainerStarted","Data":"3ca7a1ecd2a12c9cc6fa05f1abb384c0f2bbf363a8fabd2fc4678edc68d62eb1"} Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.040885 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerStarted","Data":"9b57d467e31b418205fe3ef9e85663edc66ed3d3b0ddb2248eb642963faca2ad"} Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.509594 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8k7hb"] Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.512036 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.515509 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.519805 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k7hb"] Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.678514 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-catalog-content\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.678805 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-utilities\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.678875 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6sp\" (UniqueName: \"kubernetes.io/projected/580f4a26-2d39-479e-8815-185e094f1469-kube-api-access-5k6sp\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.710252 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgrpv"] Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.711475 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.715023 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.718049 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgrpv"] Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.780290 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6sp\" (UniqueName: \"kubernetes.io/projected/580f4a26-2d39-479e-8815-185e094f1469-kube-api-access-5k6sp\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.780519 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-catalog-content\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.780604 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-utilities\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.781408 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-utilities\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 
30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.781556 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580f4a26-2d39-479e-8815-185e094f1469-catalog-content\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.798341 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6sp\" (UniqueName: \"kubernetes.io/projected/580f4a26-2d39-479e-8815-185e094f1469-kube-api-access-5k6sp\") pod \"redhat-marketplace-8k7hb\" (UID: \"580f4a26-2d39-479e-8815-185e094f1469\") " pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.881778 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-utilities\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.881920 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-catalog-content\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.881993 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfp4\" (UniqueName: \"kubernetes.io/projected/1b07c611-8431-4db0-b22d-89f8e391c90f-kube-api-access-dhfp4\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " 
pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.945201 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.983359 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-utilities\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.983954 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-catalog-content\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.983879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-utilities\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.984037 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfp4\" (UniqueName: \"kubernetes.io/projected/1b07c611-8431-4db0-b22d-89f8e391c90f-kube-api-access-dhfp4\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:30 crc kubenswrapper[4797]: I0930 17:48:30.984397 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1b07c611-8431-4db0-b22d-89f8e391c90f-catalog-content\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.002314 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfp4\" (UniqueName: \"kubernetes.io/projected/1b07c611-8431-4db0-b22d-89f8e391c90f-kube-api-access-dhfp4\") pod \"redhat-operators-zgrpv\" (UID: \"1b07c611-8431-4db0-b22d-89f8e391c90f\") " pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.030173 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.051605 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxk9" event={"ID":"b8e54563-9315-4e3e-9527-6d2849b83ee3","Type":"ContainerDied","Data":"a3e194daa06e52a1ff14e73cd77cfa71b2dba1c079714bd4a87785d1e5baf5ac"} Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.052197 4797 generic.go:334] "Generic (PLEG): container finished" podID="b8e54563-9315-4e3e-9527-6d2849b83ee3" containerID="a3e194daa06e52a1ff14e73cd77cfa71b2dba1c079714bd4a87785d1e5baf5ac" exitCode=0 Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.055020 4797 generic.go:334] "Generic (PLEG): container finished" podID="42c7f1c4-357b-4a48-b0b0-71088e564851" containerID="9b57d467e31b418205fe3ef9e85663edc66ed3d3b0ddb2248eb642963faca2ad" exitCode=0 Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.055085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerDied","Data":"9b57d467e31b418205fe3ef9e85663edc66ed3d3b0ddb2248eb642963faca2ad"} Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 
17:48:31.156697 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k7hb"] Sep 30 17:48:31 crc kubenswrapper[4797]: I0930 17:48:31.217897 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgrpv"] Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.065298 4797 generic.go:334] "Generic (PLEG): container finished" podID="580f4a26-2d39-479e-8815-185e094f1469" containerID="1161b338d01ca3b2f7d89fd6d71ba5a134487eaf06c57819f703215e02bc2279" exitCode=0 Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.065403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k7hb" event={"ID":"580f4a26-2d39-479e-8815-185e094f1469","Type":"ContainerDied","Data":"1161b338d01ca3b2f7d89fd6d71ba5a134487eaf06c57819f703215e02bc2279"} Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.065804 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k7hb" event={"ID":"580f4a26-2d39-479e-8815-185e094f1469","Type":"ContainerStarted","Data":"3adc61eb313e291d6a83d5a81231d57e430fc02515882c25296b97f865d786b0"} Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.069678 4797 generic.go:334] "Generic (PLEG): container finished" podID="1b07c611-8431-4db0-b22d-89f8e391c90f" containerID="e99dd40c4207a152a9e02a29e877e08b941034d70c1a8f9c940c3c3bd53421e8" exitCode=0 Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.069715 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgrpv" event={"ID":"1b07c611-8431-4db0-b22d-89f8e391c90f","Type":"ContainerDied","Data":"e99dd40c4207a152a9e02a29e877e08b941034d70c1a8f9c940c3c3bd53421e8"} Sep 30 17:48:32 crc kubenswrapper[4797]: I0930 17:48:32.069738 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgrpv" 
event={"ID":"1b07c611-8431-4db0-b22d-89f8e391c90f","Type":"ContainerStarted","Data":"b07eaa430bed075b2744b82a73dd4feb096722e35b688391c23907bdf732fef6"} Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.103054 4797 generic.go:334] "Generic (PLEG): container finished" podID="580f4a26-2d39-479e-8815-185e094f1469" containerID="a858d998ff6fcd147394a51a1d3dd3a6ae7658e58df2adccf1bb0087bc33f2eb" exitCode=0 Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.103097 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k7hb" event={"ID":"580f4a26-2d39-479e-8815-185e094f1469","Type":"ContainerDied","Data":"a858d998ff6fcd147394a51a1d3dd3a6ae7658e58df2adccf1bb0087bc33f2eb"} Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.106691 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgrpv" event={"ID":"1b07c611-8431-4db0-b22d-89f8e391c90f","Type":"ContainerStarted","Data":"5d3386e959c53c1a3f45bb512d796efcfaae333de6debdfc35d3d0f1665ca398"} Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.108302 4797 generic.go:334] "Generic (PLEG): container finished" podID="b8e54563-9315-4e3e-9527-6d2849b83ee3" containerID="5df57eb23e9fc5767381de8ee29bf04e35b7810c149521c38d34fd53c378cd09" exitCode=0 Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.108373 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxk9" event={"ID":"b8e54563-9315-4e3e-9527-6d2849b83ee3","Type":"ContainerDied","Data":"5df57eb23e9fc5767381de8ee29bf04e35b7810c149521c38d34fd53c378cd09"} Sep 30 17:48:37 crc kubenswrapper[4797]: I0930 17:48:37.110300 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerStarted","Data":"3454c2d9b937ed45ee762c209f574ff9d89ce5bdf426d25e03534292e6a7a88e"} Sep 30 17:48:38 crc kubenswrapper[4797]: I0930 
17:48:38.125056 4797 generic.go:334] "Generic (PLEG): container finished" podID="1b07c611-8431-4db0-b22d-89f8e391c90f" containerID="5d3386e959c53c1a3f45bb512d796efcfaae333de6debdfc35d3d0f1665ca398" exitCode=0 Sep 30 17:48:38 crc kubenswrapper[4797]: I0930 17:48:38.125415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgrpv" event={"ID":"1b07c611-8431-4db0-b22d-89f8e391c90f","Type":"ContainerDied","Data":"5d3386e959c53c1a3f45bb512d796efcfaae333de6debdfc35d3d0f1665ca398"} Sep 30 17:48:38 crc kubenswrapper[4797]: I0930 17:48:38.132908 4797 generic.go:334] "Generic (PLEG): container finished" podID="42c7f1c4-357b-4a48-b0b0-71088e564851" containerID="3454c2d9b937ed45ee762c209f574ff9d89ce5bdf426d25e03534292e6a7a88e" exitCode=0 Sep 30 17:48:38 crc kubenswrapper[4797]: I0930 17:48:38.132949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerDied","Data":"3454c2d9b937ed45ee762c209f574ff9d89ce5bdf426d25e03534292e6a7a88e"} Sep 30 17:48:40 crc kubenswrapper[4797]: I0930 17:48:40.146030 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gwld" event={"ID":"42c7f1c4-357b-4a48-b0b0-71088e564851","Type":"ContainerStarted","Data":"4e1f3431af3bc33f2241574cfdc593d319c006f598f242b962fcd83bdf9d62dc"} Sep 30 17:48:40 crc kubenswrapper[4797]: I0930 17:48:40.165957 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8gwld" podStartSLOduration=3.845074001 podStartE2EDuration="12.1659416s" podCreationTimestamp="2025-09-30 17:48:28 +0000 UTC" firstStartedPulling="2025-09-30 17:48:31.063679503 +0000 UTC m=+361.586178751" lastFinishedPulling="2025-09-30 17:48:39.384547102 +0000 UTC m=+369.907046350" observedRunningTime="2025-09-30 17:48:40.163958888 +0000 UTC m=+370.686458126" 
watchObservedRunningTime="2025-09-30 17:48:40.1659416 +0000 UTC m=+370.688440838" Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.159739 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k7hb" event={"ID":"580f4a26-2d39-479e-8815-185e094f1469","Type":"ContainerStarted","Data":"32c05c09e05f9753ef34cdf7e5263a3b940a083619c21d0aadb86f02c6c29cf3"} Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.163188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgrpv" event={"ID":"1b07c611-8431-4db0-b22d-89f8e391c90f","Type":"ContainerStarted","Data":"5cd7f2fa2af1c442c59aec3bd667674e5bad823eb80f033ccf07619cd5475c16"} Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.165715 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8pxk9" event={"ID":"b8e54563-9315-4e3e-9527-6d2849b83ee3","Type":"ContainerStarted","Data":"c0c4a4a35865afc65a242fc43a920d33e0af7a4b27c69366d256914d2e1401e7"} Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.196839 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8k7hb" podStartSLOduration=3.7136129159999998 podStartE2EDuration="12.196817046s" podCreationTimestamp="2025-09-30 17:48:30 +0000 UTC" firstStartedPulling="2025-09-30 17:48:32.066856025 +0000 UTC m=+362.589355273" lastFinishedPulling="2025-09-30 17:48:40.550060125 +0000 UTC m=+371.072559403" observedRunningTime="2025-09-30 17:48:42.191184996 +0000 UTC m=+372.713684334" watchObservedRunningTime="2025-09-30 17:48:42.196817046 +0000 UTC m=+372.719316304" Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.215534 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgrpv" podStartSLOduration=2.683046838 podStartE2EDuration="12.215510271s" podCreationTimestamp="2025-09-30 17:48:30 +0000 UTC" 
firstStartedPulling="2025-09-30 17:48:32.080790665 +0000 UTC m=+362.603289893" lastFinishedPulling="2025-09-30 17:48:41.613254088 +0000 UTC m=+372.135753326" observedRunningTime="2025-09-30 17:48:42.210076477 +0000 UTC m=+372.732575735" watchObservedRunningTime="2025-09-30 17:48:42.215510271 +0000 UTC m=+372.738009509" Sep 30 17:48:42 crc kubenswrapper[4797]: I0930 17:48:42.229762 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8pxk9" podStartSLOduration=4.324304134 podStartE2EDuration="14.229743148s" podCreationTimestamp="2025-09-30 17:48:28 +0000 UTC" firstStartedPulling="2025-09-30 17:48:31.05342177 +0000 UTC m=+361.575921008" lastFinishedPulling="2025-09-30 17:48:40.958860764 +0000 UTC m=+371.481360022" observedRunningTime="2025-09-30 17:48:42.226448741 +0000 UTC m=+372.748947989" watchObservedRunningTime="2025-09-30 17:48:42.229743148 +0000 UTC m=+372.752242386" Sep 30 17:48:44 crc kubenswrapper[4797]: I0930 17:48:44.192692 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:48:44 crc kubenswrapper[4797]: I0930 17:48:44.193002 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.463525 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.463581 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.505079 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.647921 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.647980 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:48 crc kubenswrapper[4797]: I0930 17:48:48.722522 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:49 crc kubenswrapper[4797]: I0930 17:48:49.263737 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8pxk9" Sep 30 17:48:49 crc kubenswrapper[4797]: I0930 17:48:49.265019 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8gwld" Sep 30 17:48:50 crc kubenswrapper[4797]: I0930 17:48:50.946505 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:50 crc kubenswrapper[4797]: I0930 17:48:50.947790 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:51 crc kubenswrapper[4797]: I0930 17:48:51.016871 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:48:51 crc kubenswrapper[4797]: I0930 17:48:51.030944 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:51 
crc kubenswrapper[4797]: I0930 17:48:51.031021 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:51 crc kubenswrapper[4797]: I0930 17:48:51.113017 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:51 crc kubenswrapper[4797]: I0930 17:48:51.289844 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgrpv" Sep 30 17:48:51 crc kubenswrapper[4797]: I0930 17:48:51.302616 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8k7hb" Sep 30 17:49:14 crc kubenswrapper[4797]: I0930 17:49:14.192751 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:49:14 crc kubenswrapper[4797]: I0930 17:49:14.193376 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.251785 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l2fwv"] Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.253829 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.265506 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l2fwv"] Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.446999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447062 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50c7701-4231-48ba-96ae-2ee94d7edb7b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-trusted-ca\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-certificates\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447145 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-bound-sa-token\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447170 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50c7701-4231-48ba-96ae-2ee94d7edb7b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jhr\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-kube-api-access-p2jhr\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.447237 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-tls\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.468508 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.548698 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-bound-sa-token\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.548829 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50c7701-4231-48ba-96ae-2ee94d7edb7b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.548909 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jhr\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-kube-api-access-p2jhr\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.548984 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-tls\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.549112 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50c7701-4231-48ba-96ae-2ee94d7edb7b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.549219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-trusted-ca\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.549426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-certificates\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.550873 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-certificates\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.550932 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c50c7701-4231-48ba-96ae-2ee94d7edb7b-trusted-ca\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" 
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.551210 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c50c7701-4231-48ba-96ae-2ee94d7edb7b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.556290 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-registry-tls\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.560859 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c50c7701-4231-48ba-96ae-2ee94d7edb7b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.571505 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jhr\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-kube-api-access-p2jhr\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.573112 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c50c7701-4231-48ba-96ae-2ee94d7edb7b-bound-sa-token\") pod \"image-registry-66df7c8f76-l2fwv\" (UID: \"c50c7701-4231-48ba-96ae-2ee94d7edb7b\") " pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:24 crc kubenswrapper[4797]: I0930 17:49:24.871876 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:25 crc kubenswrapper[4797]: I0930 17:49:25.332931 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l2fwv"]
Sep 30 17:49:25 crc kubenswrapper[4797]: I0930 17:49:25.502538 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" event={"ID":"c50c7701-4231-48ba-96ae-2ee94d7edb7b","Type":"ContainerStarted","Data":"ec7f4ea7312f3e5c85c32d1bc5d62c018023cf219904fba52cfab6d1f6d127f1"}
Sep 30 17:49:26 crc kubenswrapper[4797]: I0930 17:49:26.511246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" event={"ID":"c50c7701-4231-48ba-96ae-2ee94d7edb7b","Type":"ContainerStarted","Data":"59cc1f476e3e57c32f95ce2ab8e24347963c0e3805f734e1246df2c92ef1579d"}
Sep 30 17:49:26 crc kubenswrapper[4797]: I0930 17:49:26.511694 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:26 crc kubenswrapper[4797]: I0930 17:49:26.544020 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv" podStartSLOduration=2.5439866970000002 podStartE2EDuration="2.543986697s" podCreationTimestamp="2025-09-30 17:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:49:26.537429755 +0000 UTC m=+417.059929063" watchObservedRunningTime="2025-09-30 17:49:26.543986697 +0000 UTC m=+417.066485965"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.192066 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.192812 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.192889 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.193910 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.194000 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5" gracePeriod=600
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.658351 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5" exitCode=0
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.658475 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5"}
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.659093 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae"}
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.659202 4797 scope.go:117] "RemoveContainer" containerID="a82de850a359021b81d0cb1b75e5918e36ade86b8c89638bf008d9036487502e"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.877972 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l2fwv"
Sep 30 17:49:44 crc kubenswrapper[4797]: I0930 17:49:44.939769 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"]
Sep 30 17:50:09 crc kubenswrapper[4797]: I0930 17:50:09.983383 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" podUID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" containerName="registry" containerID="cri-o://89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416" gracePeriod=30
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.375149 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.567924 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.567963 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2vxl\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568014 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568035 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568079 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568100 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568133 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.568167 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets\") pod \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\" (UID: \"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4\") "
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.569187 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.569469 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.574900 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.575040 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.576883 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl" (OuterVolumeSpecName: "kube-api-access-k2vxl") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "kube-api-access-k2vxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.577085 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.586596 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.593338 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" (UID: "7b643afa-b83b-4ad0-9d8e-d950d5fba3e4"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670087 4797 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-bound-sa-token\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670140 4797 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670153 4797 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-registry-certificates\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670170 4797 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670183 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2vxl\" (UniqueName: \"kubernetes.io/projected/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-kube-api-access-k2vxl\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670198 4797 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.670213 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.832555 4797 generic.go:334] "Generic (PLEG): container finished" podID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" containerID="89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416" exitCode=0
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.832607 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" event={"ID":"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4","Type":"ContainerDied","Data":"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"}
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.832641 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-shscm" event={"ID":"7b643afa-b83b-4ad0-9d8e-d950d5fba3e4","Type":"ContainerDied","Data":"5025af6818c33fe652568cfdba999634ad795eeda5ba03bab5c20ed22cf3b010"}
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.832665 4797 scope.go:117] "RemoveContainer" containerID="89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.832711 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-shscm"
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.847709 4797 scope.go:117] "RemoveContainer" containerID="89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"
Sep 30 17:50:10 crc kubenswrapper[4797]: E0930 17:50:10.848098 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416\": container with ID starting with 89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416 not found: ID does not exist" containerID="89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.848148 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416"} err="failed to get container status \"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416\": rpc error: code = NotFound desc = could not find container \"89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416\": container with ID starting with 89919ab79a6aa6b3c61ab271048ebb42d15d7b57245d748cfa883df86daed416 not found: ID does not exist"
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.877599 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"]
Sep 30 17:50:10 crc kubenswrapper[4797]: I0930 17:50:10.884311 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-shscm"]
Sep 30 17:50:12 crc kubenswrapper[4797]: I0930 17:50:12.249226 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" path="/var/lib/kubelet/pods/7b643afa-b83b-4ad0-9d8e-d950d5fba3e4/volumes"
Sep 30 17:51:44 crc kubenswrapper[4797]: I0930 17:51:44.191735 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:51:44 crc kubenswrapper[4797]: I0930 17:51:44.192390 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:52:14 crc kubenswrapper[4797]: I0930 17:52:14.192324 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:52:14 crc kubenswrapper[4797]: I0930 17:52:14.193054 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.192591 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.193304 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.193372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9"
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.194297 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.194404 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae" gracePeriod=600
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.880417 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae" exitCode=0
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.880481 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae"}
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.880904 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151"}
Sep 30 17:52:44 crc kubenswrapper[4797]: I0930 17:52:44.880939 4797 scope.go:117] "RemoveContainer" containerID="0bb69cd4b8dc2cd622f66f312ff74c3e36c56d0368d212b2150249a3933839d5"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.177320 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"]
Sep 30 17:53:37 crc kubenswrapper[4797]: E0930 17:53:37.178231 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" containerName="registry"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.178243 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" containerName="registry"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.178350 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b643afa-b83b-4ad0-9d8e-d950d5fba3e4" containerName="registry"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.178771 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.181101 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-l4vfs"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.182002 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.182085 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.207991 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2wj9r"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.209389 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2wj9r"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.210987 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-77l4k"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.212049 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.213986 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rj6qb"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.216043 4797 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5sr45"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.252371 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.255848 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-77l4k"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.264838 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2wj9r"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.296007 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxmzk\" (UniqueName: \"kubernetes.io/projected/696a605e-f3af-4acb-941d-22aa927ba890-kube-api-access-rxmzk\") pod \"cert-manager-cainjector-7f985d654d-qj8wf\" (UID: \"696a605e-f3af-4acb-941d-22aa927ba890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.397008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwhv\" (UniqueName: \"kubernetes.io/projected/9d1c6f31-57b5-4629-b6aa-abcf3394a4f4-kube-api-access-jrwhv\") pod \"cert-manager-5b446d88c5-2wj9r\" (UID: \"9d1c6f31-57b5-4629-b6aa-abcf3394a4f4\") " pod="cert-manager/cert-manager-5b446d88c5-2wj9r"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.397165 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxmzk\" (UniqueName: \"kubernetes.io/projected/696a605e-f3af-4acb-941d-22aa927ba890-kube-api-access-rxmzk\") pod \"cert-manager-cainjector-7f985d654d-qj8wf\" (UID: \"696a605e-f3af-4acb-941d-22aa927ba890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.397208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjx8\" (UniqueName: \"kubernetes.io/projected/ddd09658-492a-4954-8d85-7c05dbe4b5c4-kube-api-access-9gjx8\") pod \"cert-manager-webhook-5655c58dd6-77l4k\" (UID: \"ddd09658-492a-4954-8d85-7c05dbe4b5c4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.423010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxmzk\" (UniqueName: \"kubernetes.io/projected/696a605e-f3af-4acb-941d-22aa927ba890-kube-api-access-rxmzk\") pod \"cert-manager-cainjector-7f985d654d-qj8wf\" (UID: \"696a605e-f3af-4acb-941d-22aa927ba890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.498901 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwhv\" (UniqueName: \"kubernetes.io/projected/9d1c6f31-57b5-4629-b6aa-abcf3394a4f4-kube-api-access-jrwhv\") pod \"cert-manager-5b446d88c5-2wj9r\" (UID: \"9d1c6f31-57b5-4629-b6aa-abcf3394a4f4\") " pod="cert-manager/cert-manager-5b446d88c5-2wj9r"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.499116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjx8\" (UniqueName: \"kubernetes.io/projected/ddd09658-492a-4954-8d85-7c05dbe4b5c4-kube-api-access-9gjx8\") pod \"cert-manager-webhook-5655c58dd6-77l4k\" (UID: \"ddd09658-492a-4954-8d85-7c05dbe4b5c4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.499184 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.525168 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjx8\" (UniqueName: \"kubernetes.io/projected/ddd09658-492a-4954-8d85-7c05dbe4b5c4-kube-api-access-9gjx8\") pod \"cert-manager-webhook-5655c58dd6-77l4k\" (UID: \"ddd09658-492a-4954-8d85-7c05dbe4b5c4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.526681 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwhv\" (UniqueName: \"kubernetes.io/projected/9d1c6f31-57b5-4629-b6aa-abcf3394a4f4-kube-api-access-jrwhv\") pod \"cert-manager-5b446d88c5-2wj9r\" (UID: \"9d1c6f31-57b5-4629-b6aa-abcf3394a4f4\") " pod="cert-manager/cert-manager-5b446d88c5-2wj9r"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.532099 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2wj9r"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.539955 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.851866 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-77l4k"]
Sep 30 17:53:37 crc kubenswrapper[4797]: I0930 17:53:37.859672 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:53:38 crc kubenswrapper[4797]: I0930 17:53:38.007188 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qj8wf"]
Sep 30 17:53:38 crc kubenswrapper[4797]: W0930 17:53:38.013175 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696a605e_f3af_4acb_941d_22aa927ba890.slice/crio-201a5c9202efcb8348eff5aa02a55d162e1e820ddb90b693181b4d036831c3cc WatchSource:0}: Error finding container 201a5c9202efcb8348eff5aa02a55d162e1e820ddb90b693181b4d036831c3cc: Status 404 returned error can't find the container with id 201a5c9202efcb8348eff5aa02a55d162e1e820ddb90b693181b4d036831c3cc
Sep 30 17:53:38 crc kubenswrapper[4797]: I0930 17:53:38.020703 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2wj9r"]
Sep 30 17:53:38 crc kubenswrapper[4797]: W0930 17:53:38.027107 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1c6f31_57b5_4629_b6aa_abcf3394a4f4.slice/crio-db3130876c3d906244d0efaa436aeb77fbba6186bd7de3ecbb4aae01057a2b37 WatchSource:0}: Error finding container db3130876c3d906244d0efaa436aeb77fbba6186bd7de3ecbb4aae01057a2b37: Status 404 returned error can't find the container with id db3130876c3d906244d0efaa436aeb77fbba6186bd7de3ecbb4aae01057a2b37
Sep 30 17:53:38 crc kubenswrapper[4797]: I0930 17:53:38.247224 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k" event={"ID":"ddd09658-492a-4954-8d85-7c05dbe4b5c4","Type":"ContainerStarted","Data":"a567132c96f83dce27f7fc0ba25266609591f1e7fbdefb38dee157ff365b2812"}
Sep 30 17:53:38 crc kubenswrapper[4797]: I0930 17:53:38.248606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2wj9r" event={"ID":"9d1c6f31-57b5-4629-b6aa-abcf3394a4f4","Type":"ContainerStarted","Data":"db3130876c3d906244d0efaa436aeb77fbba6186bd7de3ecbb4aae01057a2b37"}
Sep 30 17:53:38 crc kubenswrapper[4797]: I0930 17:53:38.248650 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf" event={"ID":"696a605e-f3af-4acb-941d-22aa927ba890","Type":"ContainerStarted","Data":"201a5c9202efcb8348eff5aa02a55d162e1e820ddb90b693181b4d036831c3cc"}
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.272945 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k" event={"ID":"ddd09658-492a-4954-8d85-7c05dbe4b5c4","Type":"ContainerStarted","Data":"1ea556dfeb922a268d5cf3d03f94240d15afe39de6cedffa4a714f91d86c13e3"}
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.273540 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.275856 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2wj9r" event={"ID":"9d1c6f31-57b5-4629-b6aa-abcf3394a4f4","Type":"ContainerStarted","Data":"568bdbd779ecb4bfb8d4eb07e5b47ed856c32287373b50269a2f80d5950dcf67"}
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.279420 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf" event={"ID":"696a605e-f3af-4acb-941d-22aa927ba890","Type":"ContainerStarted","Data":"1f4a68036a274136739b096a543b34ab75ee101f3fd0daa02b71101fb9b839d3"}
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.295721 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k" podStartSLOduration=1.840263852 podStartE2EDuration="5.295686949s" podCreationTimestamp="2025-09-30 17:53:37 +0000 UTC" firstStartedPulling="2025-09-30 17:53:37.859285787 +0000 UTC m=+668.381785015" lastFinishedPulling="2025-09-30 17:53:41.314708874 +0000 UTC m=+671.837208112" observedRunningTime="2025-09-30 17:53:42.292946417 +0000 UTC m=+672.815445725" watchObservedRunningTime="2025-09-30 17:53:42.295686949 +0000 UTC m=+672.818186217"
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.312822 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qj8wf" podStartSLOduration=1.951269309 podStartE2EDuration="5.312791815s" podCreationTimestamp="2025-09-30 17:53:37 +0000 UTC" firstStartedPulling="2025-09-30 17:53:38.021634082 +0000 UTC m=+668.544133340" lastFinishedPulling="2025-09-30 17:53:41.383156608 +0000 UTC m=+671.905655846" observedRunningTime="2025-09-30 17:53:42.308167325 +0000 UTC m=+672.830666653" watchObservedRunningTime="2025-09-30 17:53:42.312791815 +0000 UTC m=+672.835291063"
Sep 30 17:53:42 crc kubenswrapper[4797]: I0930 17:53:42.328358 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-2wj9r" podStartSLOduration=2.042402857 podStartE2EDuration="5.328324084s" podCreationTimestamp="2025-09-30 17:53:37 +0000 UTC" firstStartedPulling="2025-09-30 17:53:38.028758147 +0000 UTC m=+668.551257395" lastFinishedPulling="2025-09-30 17:53:41.314679384 +0000 UTC m=+671.837178622" observedRunningTime="2025-09-30 17:53:42.326377055 +0000 UTC m=+672.848876333" watchObservedRunningTime="2025-09-30 17:53:42.328324084 +0000 UTC m=+672.850823352"
Sep 30 17:53:47 crc kubenswrapper[4797]: I0930 17:53:47.544550 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-77l4k"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.557628 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g447b"]
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.558462 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-controller" containerID="cri-o://04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.558591 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.558716 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="northd" containerID="cri-o://918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.558839 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-node" containerID="cri-o://ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.558927 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="sbdb" containerID="cri-o://212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.559012 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="nbdb" containerID="cri-o://9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.560582 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-acl-logging" containerID="cri-o://7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.624080 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" containerID="cri-o://ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" gracePeriod=30
Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.630716 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.632375 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.635116 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.635241 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.849030 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd is running failed: container process not found" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.849142 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 is running failed: container process not found" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.849566 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd is running failed: container process not found" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.849690 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 is running failed: container process not found" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.850247 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd is running failed: container process not found" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.850271 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 is running failed: container process not found" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.850293 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="sbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.850313 4797 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="nbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.896343 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/3.log" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.899838 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovn-acl-logging/0.log" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.900817 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovn-controller/0.log" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.901553 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970251 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c9fqh"] Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970488 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="northd" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970502 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="northd" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970514 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970524 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970537 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="nbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970544 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="nbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970558 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-acl-logging" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970566 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-acl-logging" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970578 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" Sep 
30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970585 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970595 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970602 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970612 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="sbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970620 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="sbdb" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970628 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-node" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970638 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-node" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970650 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970659 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970670 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kubecfg-setup" 
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970678 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kubecfg-setup"
Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970692 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970699 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.970710 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970718 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970849 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970859 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="northd"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970868 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970876 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="nbdb"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970887 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="sbdb"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970900 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970909 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970919 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-node"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970930 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovn-acl-logging"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970939 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.970947 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="kube-rbac-proxy-ovn-metrics"
Sep 30 17:53:53 crc kubenswrapper[4797]: E0930 17:53:53.971063 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.971071 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.971185 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" containerName="ovnkube-controller"
Sep 30 17:53:53 crc kubenswrapper[4797]: I0930 17:53:53.973041 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh"
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037314 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037378 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037415 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037478 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037492 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037496 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037536 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037552 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037572 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037698 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log" (OuterVolumeSpecName: "node-log") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037800 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037893 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.037979 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.038070 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.038169 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039574 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t72jb\" (UniqueName: \"kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039660 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039721 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039775 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039825 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039869 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039884 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039918 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039937 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket" (OuterVolumeSpecName: "log-socket") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039970 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.039988 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040009 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040006 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040149 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040207 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040258 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config\") pod \"4c749a60-66ac-44d6-955f-a3d050b12758\" (UID: \"4c749a60-66ac-44d6-955f-a3d050b12758\") "
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040322 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040328 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash" (OuterVolumeSpecName: "host-slash") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040735 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040846 4797 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-bin\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040884 4797 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040910 4797 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-env-overrides\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040935 4797 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.040994 4797 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041019 4797 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-log-socket\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041041 4797 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-cni-netd\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041065 4797 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041090 4797 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-kubelet\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041112 4797 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-slash\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041164 4797 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041188 4797 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-host-run-netns\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041210 4797 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041220 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "ovnkube-config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041232 4797 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.041296 4797 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.047060 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb" (OuterVolumeSpecName: "kube-api-access-t72jb") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "kube-api-access-t72jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.047601 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.063620 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4c749a60-66ac-44d6-955f-a3d050b12758" (UID: "4c749a60-66ac-44d6-955f-a3d050b12758"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.142963 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-slash\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-env-overrides\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143160 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-etc-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143209 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-systemd-units\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-node-log\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143494 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-systemd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143538 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-kubelet\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.143584 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-var-lib-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-netd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144212 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwfs\" (UniqueName: \"kubernetes.io/projected/81d31a8e-91ef-4e32-b508-13461317fe44-kube-api-access-lzwfs\") pod \"ovnkube-node-c9fqh\" (UID: 
\"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144315 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-netns\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-script-lib\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144530 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144668 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-bin\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144710 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.144869 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-log-socket\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145012 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-ovn\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145141 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81d31a8e-91ef-4e32-b508-13461317fe44-ovn-node-metrics-cert\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-config\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145329 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145486 4797 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c749a60-66ac-44d6-955f-a3d050b12758-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145513 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145539 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c749a60-66ac-44d6-955f-a3d050b12758-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145558 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t72jb\" (UniqueName: \"kubernetes.io/projected/4c749a60-66ac-44d6-955f-a3d050b12758-kube-api-access-t72jb\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.145584 4797 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c749a60-66ac-44d6-955f-a3d050b12758-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246549 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-var-lib-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: 
\"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-netd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246648 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-netns\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246678 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwfs\" (UniqueName: \"kubernetes.io/projected/81d31a8e-91ef-4e32-b508-13461317fe44-kube-api-access-lzwfs\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246721 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-script-lib\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246754 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246758 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-var-lib-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246836 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-netns\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246872 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-bin\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-bin\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246853 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 
17:53:54.246766 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-cni-netd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.246972 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247023 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-run-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-log-socket\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247077 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-log-socket\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247087 4797 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-ovn\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247124 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-ovn\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247136 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81d31a8e-91ef-4e32-b508-13461317fe44-ovn-node-metrics-cert\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247184 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-config\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-slash\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247294 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-env-overrides\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247328 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-etc-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247362 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-systemd-units\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247395 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-node-log\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247425 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-kubelet\") pod 
\"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247488 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-systemd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247569 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-run-systemd\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247714 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-etc-openvswitch\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247833 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247873 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-slash\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247948 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-node-log\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.247993 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-systemd-units\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.248042 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/81d31a8e-91ef-4e32-b508-13461317fe44-host-kubelet\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.248036 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-script-lib\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.248678 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-env-overrides\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.248682 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/81d31a8e-91ef-4e32-b508-13461317fe44-ovnkube-config\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.253160 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/81d31a8e-91ef-4e32-b508-13461317fe44-ovn-node-metrics-cert\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.280423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwfs\" (UniqueName: \"kubernetes.io/projected/81d31a8e-91ef-4e32-b508-13461317fe44-kube-api-access-lzwfs\") pod \"ovnkube-node-c9fqh\" (UID: \"81d31a8e-91ef-4e32-b508-13461317fe44\") " pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.288758 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.368017 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovnkube-controller/3.log" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.372119 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovn-acl-logging/0.log" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373013 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g447b_4c749a60-66ac-44d6-955f-a3d050b12758/ovn-controller/0.log" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373772 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373880 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373884 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373960 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373979 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374013 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.373902 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374068 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374049 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374104 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374151 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" 
event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374084 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374283 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" exitCode=0 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374304 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" exitCode=143 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374321 4797 generic.go:334] "Generic (PLEG): container finished" podID="4c749a60-66ac-44d6-955f-a3d050b12758" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" exitCode=143 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374795 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374917 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 
17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374935 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374954 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374968 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374981 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.374996 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375012 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375026 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" 
event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375079 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375099 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375113 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.375127 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376777 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376797 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376814 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376827 4797 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376841 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376856 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376888 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376922 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376941 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376956 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376969 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:53:54 crc kubenswrapper[4797]: 
I0930 17:53:54.376984 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.376997 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377011 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377025 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377039 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377052 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377073 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g447b" event={"ID":"4c749a60-66ac-44d6-955f-a3d050b12758","Type":"ContainerDied","Data":"7e2e978dc307315b100e524c1dfeee0884affbca1c1bbb630bf16b2e1e1b69c4"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377097 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377113 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377132 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377143 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377154 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377164 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377175 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377186 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377197 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377208 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.377783 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"4c6f8f0cc057b13bef28e9fb25c246731b721413e4fe235cb28923b67775ce95"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.381548 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/2.log" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.382163 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/1.log" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.382246 4797 generic.go:334] "Generic (PLEG): container finished" podID="aba20a5a-9a27-4df1-899d-a107aef7a231" containerID="07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020" exitCode=2 Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.382286 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerDied","Data":"07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.382311 4797 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f"} Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.383203 4797 scope.go:117] 
"RemoveContainer" containerID="07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.383664 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-w74xm_openshift-multus(aba20a5a-9a27-4df1-899d-a107aef7a231)\"" pod="openshift-multus/multus-w74xm" podUID="aba20a5a-9a27-4df1-899d-a107aef7a231" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.409780 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.441831 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g447b"] Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.451329 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g447b"] Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.457183 4797 scope.go:117] "RemoveContainer" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.474532 4797 scope.go:117] "RemoveContainer" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.496742 4797 scope.go:117] "RemoveContainer" containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.519145 4797 scope.go:117] "RemoveContainer" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.540886 4797 scope.go:117] "RemoveContainer" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.572796 4797 scope.go:117] 
"RemoveContainer" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.626097 4797 scope.go:117] "RemoveContainer" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.648447 4797 scope.go:117] "RemoveContainer" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.662483 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.663080 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.663203 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} err="failed to get container status \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.663248 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.663649 4797 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": container with ID starting with ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b not found: ID does not exist" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.663694 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} err="failed to get container status \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": rpc error: code = NotFound desc = could not find container \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": container with ID starting with ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.663723 4797 scope.go:117] "RemoveContainer" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.664095 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": container with ID starting with 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd not found: ID does not exist" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664124 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} err="failed to get container status \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": rpc error: code = NotFound desc = could not find container 
\"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": container with ID starting with 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664143 4797 scope.go:117] "RemoveContainer" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.664480 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": container with ID starting with 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 not found: ID does not exist" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664516 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} err="failed to get container status \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": rpc error: code = NotFound desc = could not find container \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": container with ID starting with 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664535 4797 scope.go:117] "RemoveContainer" containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.664791 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": container with ID starting with 918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7 not found: ID does not exist" 
containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664813 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} err="failed to get container status \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": rpc error: code = NotFound desc = could not find container \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": container with ID starting with 918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.664830 4797 scope.go:117] "RemoveContainer" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.665129 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": container with ID starting with d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9 not found: ID does not exist" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665197 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} err="failed to get container status \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": rpc error: code = NotFound desc = could not find container \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": container with ID starting with d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665256 4797 scope.go:117] 
"RemoveContainer" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.665621 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": container with ID starting with ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0 not found: ID does not exist" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665656 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} err="failed to get container status \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": rpc error: code = NotFound desc = could not find container \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": container with ID starting with ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665673 4797 scope.go:117] "RemoveContainer" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.665914 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": container with ID starting with 7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a not found: ID does not exist" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665937 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} err="failed to get container status \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": rpc error: code = NotFound desc = could not find container \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": container with ID starting with 7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.665955 4797 scope.go:117] "RemoveContainer" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.666235 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": container with ID starting with 04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db not found: ID does not exist" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.666292 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} err="failed to get container status \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": rpc error: code = NotFound desc = could not find container \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": container with ID starting with 04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.666325 4797 scope.go:117] "RemoveContainer" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: E0930 17:53:54.666650 4797 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": container with ID starting with 42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762 not found: ID does not exist" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.666684 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} err="failed to get container status \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": rpc error: code = NotFound desc = could not find container \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": container with ID starting with 42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.666702 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.666971 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} err="failed to get container status \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.667001 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.667267 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} err="failed to get container status \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": rpc error: code = NotFound desc = could not find container \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": container with ID starting with ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.667286 4797 scope.go:117] "RemoveContainer" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.667795 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} err="failed to get container status \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": rpc error: code = NotFound desc = could not find container \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": container with ID starting with 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.667865 4797 scope.go:117] "RemoveContainer" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668218 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} err="failed to get container status \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": rpc error: code = NotFound desc = could not find container \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": container with ID starting with 
9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668248 4797 scope.go:117] "RemoveContainer" containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668586 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} err="failed to get container status \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": rpc error: code = NotFound desc = could not find container \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": container with ID starting with 918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668613 4797 scope.go:117] "RemoveContainer" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668870 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} err="failed to get container status \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": rpc error: code = NotFound desc = could not find container \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": container with ID starting with d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.668906 4797 scope.go:117] "RemoveContainer" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.669200 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} err="failed to get container status \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": rpc error: code = NotFound desc = could not find container \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": container with ID starting with ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.669226 4797 scope.go:117] "RemoveContainer" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.669575 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} err="failed to get container status \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": rpc error: code = NotFound desc = could not find container \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": container with ID starting with 7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.669631 4797 scope.go:117] "RemoveContainer" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.669986 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} err="failed to get container status \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": rpc error: code = NotFound desc = could not find container \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": container with ID starting with 04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db not found: ID does not 
exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.670014 4797 scope.go:117] "RemoveContainer" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.670315 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} err="failed to get container status \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": rpc error: code = NotFound desc = could not find container \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": container with ID starting with 42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.670368 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.670707 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} err="failed to get container status \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.670735 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671016 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} err="failed to get container status 
\"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": rpc error: code = NotFound desc = could not find container \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": container with ID starting with ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671044 4797 scope.go:117] "RemoveContainer" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671369 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} err="failed to get container status \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": rpc error: code = NotFound desc = could not find container \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": container with ID starting with 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671406 4797 scope.go:117] "RemoveContainer" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671771 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} err="failed to get container status \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": rpc error: code = NotFound desc = could not find container \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": container with ID starting with 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.671812 4797 scope.go:117] "RemoveContainer" 
containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672084 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} err="failed to get container status \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": rpc error: code = NotFound desc = could not find container \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": container with ID starting with 918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672113 4797 scope.go:117] "RemoveContainer" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672644 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} err="failed to get container status \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": rpc error: code = NotFound desc = could not find container \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": container with ID starting with d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672665 4797 scope.go:117] "RemoveContainer" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672935 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} err="failed to get container status \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": rpc error: code = NotFound desc = could 
not find container \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": container with ID starting with ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.672974 4797 scope.go:117] "RemoveContainer" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.673507 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} err="failed to get container status \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": rpc error: code = NotFound desc = could not find container \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": container with ID starting with 7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.673539 4797 scope.go:117] "RemoveContainer" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.673865 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} err="failed to get container status \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": rpc error: code = NotFound desc = could not find container \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": container with ID starting with 04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.673892 4797 scope.go:117] "RemoveContainer" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 
17:53:54.674126 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} err="failed to get container status \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": rpc error: code = NotFound desc = could not find container \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": container with ID starting with 42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.674326 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.674701 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} err="failed to get container status \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.674735 4797 scope.go:117] "RemoveContainer" containerID="ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675033 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b"} err="failed to get container status \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": rpc error: code = NotFound desc = could not find container \"ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b\": container with ID starting with 
ad37a33edd22b7b4cc68ff3871ba03aeec375b09946943804310e6c5fc34146b not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675066 4797 scope.go:117] "RemoveContainer" containerID="212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675303 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd"} err="failed to get container status \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": rpc error: code = NotFound desc = could not find container \"212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd\": container with ID starting with 212f715642c84d81f3c192ff35278863c0c6c8c9e47c3bd5a74b818df2ac0cbd not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675335 4797 scope.go:117] "RemoveContainer" containerID="9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675742 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924"} err="failed to get container status \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": rpc error: code = NotFound desc = could not find container \"9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924\": container with ID starting with 9804b8617c8ffb00d348db2c3e220375ef35f93add438992131164204c278924 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.675767 4797 scope.go:117] "RemoveContainer" containerID="918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.676241 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7"} err="failed to get container status \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": rpc error: code = NotFound desc = could not find container \"918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7\": container with ID starting with 918e8d4c544f186987363c48c911fcbe73ed5edeabadec283457e5fd35b92ca7 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.676298 4797 scope.go:117] "RemoveContainer" containerID="d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.676828 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9"} err="failed to get container status \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": rpc error: code = NotFound desc = could not find container \"d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9\": container with ID starting with d068512b61ea1d946d69c9b9b49767be1e9330da1c7a14d7cabee6a1d4ac6cf9 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.676863 4797 scope.go:117] "RemoveContainer" containerID="ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.677314 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0"} err="failed to get container status \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": rpc error: code = NotFound desc = could not find container \"ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0\": container with ID starting with ba577c47bee0242be7d0c0fa4f9131e36fc7efe5d9df38a4f4a83b82ac89bac0 not found: ID does not 
exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.677338 4797 scope.go:117] "RemoveContainer" containerID="7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.677732 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a"} err="failed to get container status \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": rpc error: code = NotFound desc = could not find container \"7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a\": container with ID starting with 7ba8df1dbb64e1371e7e9bf88517bb8092fdaaaf29c329f0032497647d7bd17a not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.677782 4797 scope.go:117] "RemoveContainer" containerID="04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.678949 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db"} err="failed to get container status \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": rpc error: code = NotFound desc = could not find container \"04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db\": container with ID starting with 04f959f24f4d3fb8c6c1c26c7ed172028d13649bed1fdd366b452816c10e54db not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.678993 4797 scope.go:117] "RemoveContainer" containerID="42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.679398 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762"} err="failed to get container status 
\"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": rpc error: code = NotFound desc = could not find container \"42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762\": container with ID starting with 42201ea658aacdc0d3bd24c6c4495ad1ba16b48a29b6fce782926f8be6b92762 not found: ID does not exist" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.679465 4797 scope.go:117] "RemoveContainer" containerID="ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485" Sep 30 17:53:54 crc kubenswrapper[4797]: I0930 17:53:54.679992 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485"} err="failed to get container status \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": rpc error: code = NotFound desc = could not find container \"ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485\": container with ID starting with ac08cadb1dbba1c4b4e14591d32643934d1c5d2268adca82e37256a395e55485 not found: ID does not exist" Sep 30 17:53:55 crc kubenswrapper[4797]: I0930 17:53:55.393197 4797 generic.go:334] "Generic (PLEG): container finished" podID="81d31a8e-91ef-4e32-b508-13461317fe44" containerID="df496316d1501ef23f5db366aac45fc05e6af488e23b7ae509c54d58ab0f1c36" exitCode=0 Sep 30 17:53:55 crc kubenswrapper[4797]: I0930 17:53:55.393303 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerDied","Data":"df496316d1501ef23f5db366aac45fc05e6af488e23b7ae509c54d58ab0f1c36"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.256815 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c749a60-66ac-44d6-955f-a3d050b12758" path="/var/lib/kubelet/pods/4c749a60-66ac-44d6-955f-a3d050b12758/volumes" Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403779 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"51c8f583c47b36aa7c0950335848b64c6eed255e72270afcb46f5b1abeb310dd"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403847 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"f8993ec40ef597508a298349fed4e42668daa736807d3252b8d64bf528bbee24"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403862 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"737a4a049467d01ffd0250e74a1f06bf363d1ede30ebd416ec345e8ed92336b2"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403897 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"377de12749d6ba6454db19d7b70c4a965c41fa3bd74a21d581872eff65b3b5f1"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403913 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"a326f06c5adb699e890b0fbfa5eb051a8d73805fbff2b4a4da24ccd3a33bf445"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.403929 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"546d6e18aaa6c0216e3d9977077af083255b1aad9f7795018be7a0266c7d218d"} Sep 30 17:53:56 crc kubenswrapper[4797]: I0930 17:53:56.885194 4797 scope.go:117] "RemoveContainer" 
containerID="80a74d2505e0fcba5a50c5f3e545cf1ab75b34485bf174820fab366a06e63c1f" Sep 30 17:53:57 crc kubenswrapper[4797]: I0930 17:53:57.414312 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/2.log" Sep 30 17:53:59 crc kubenswrapper[4797]: I0930 17:53:59.434901 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"ffd03a97b82640821f06626f0af4e9ce6508dde771e389fd0fa92e7d55fa37b2"} Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.456786 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" event={"ID":"81d31a8e-91ef-4e32-b508-13461317fe44","Type":"ContainerStarted","Data":"f84811f7c95814d0f5360e8e8511b4132b0ec51c447b75995aeaa7dc628b587f"} Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.457701 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.457739 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.457762 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.495770 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" podStartSLOduration=8.495753833 podStartE2EDuration="8.495753833s" podCreationTimestamp="2025-09-30 17:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:54:01.492581138 +0000 UTC m=+692.015080446" 
watchObservedRunningTime="2025-09-30 17:54:01.495753833 +0000 UTC m=+692.018253071" Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.508324 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:01 crc kubenswrapper[4797]: I0930 17:54:01.512917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:09 crc kubenswrapper[4797]: I0930 17:54:09.237871 4797 scope.go:117] "RemoveContainer" containerID="07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020" Sep 30 17:54:09 crc kubenswrapper[4797]: E0930 17:54:09.238634 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-w74xm_openshift-multus(aba20a5a-9a27-4df1-899d-a107aef7a231)\"" pod="openshift-multus/multus-w74xm" podUID="aba20a5a-9a27-4df1-899d-a107aef7a231" Sep 30 17:54:14 crc kubenswrapper[4797]: I0930 17:54:14.996590 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l"] Sep 30 17:54:14 crc kubenswrapper[4797]: I0930 17:54:14.997913 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.000726 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.025011 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l"] Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.143553 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.143638 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lrm\" (UniqueName: \"kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.143672 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: 
I0930 17:54:15.245054 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.245156 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lrm\" (UniqueName: \"kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.245208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.245890 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.245942 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.280520 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lrm\" (UniqueName: \"kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.324826 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.365381 4797 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(b7659df7eba4413df1bf7d21b6382bb114026c7293c8874798c898f5b52c02ad): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.365550 4797 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(b7659df7eba4413df1bf7d21b6382bb114026c7293c8874798c898f5b52c02ad): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.365591 4797 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(b7659df7eba4413df1bf7d21b6382bb114026c7293c8874798c898f5b52c02ad): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.365671 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace(55addd78-6666-44ec-9bb4-56a24edfbc41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace(55addd78-6666-44ec-9bb4-56a24edfbc41)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(b7659df7eba4413df1bf7d21b6382bb114026c7293c8874798c898f5b52c02ad): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.550890 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: I0930 17:54:15.551659 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.594944 4797 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(2e4d8f1a04721a6bc259ca90f52d7e760f3934e16d09a84253860cf2c89326aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.595083 4797 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(2e4d8f1a04721a6bc259ca90f52d7e760f3934e16d09a84253860cf2c89326aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.595135 4797 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(2e4d8f1a04721a6bc259ca90f52d7e760f3934e16d09a84253860cf2c89326aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:15 crc kubenswrapper[4797]: E0930 17:54:15.595230 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace(55addd78-6666-44ec-9bb4-56a24edfbc41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace(55addd78-6666-44ec-9bb4-56a24edfbc41)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_openshift-marketplace_55addd78-6666-44ec-9bb4-56a24edfbc41_0(2e4d8f1a04721a6bc259ca90f52d7e760f3934e16d09a84253860cf2c89326aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" Sep 30 17:54:23 crc kubenswrapper[4797]: I0930 17:54:23.238818 4797 scope.go:117] "RemoveContainer" containerID="07b055347ada1770a801c814e8e17d3c951e96d78b4a341ba336bae8089ce020" Sep 30 17:54:23 crc kubenswrapper[4797]: I0930 17:54:23.605080 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w74xm_aba20a5a-9a27-4df1-899d-a107aef7a231/kube-multus/2.log" Sep 30 17:54:23 crc kubenswrapper[4797]: I0930 17:54:23.605463 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w74xm" event={"ID":"aba20a5a-9a27-4df1-899d-a107aef7a231","Type":"ContainerStarted","Data":"679650e0999c3b2e4fc3b511a2b2aa177b33057136db6d6c954a863a20108af2"} Sep 30 17:54:24 crc kubenswrapper[4797]: I0930 17:54:24.317671 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c9fqh" Sep 30 17:54:28 crc kubenswrapper[4797]: 
I0930 17:54:28.237592 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:28 crc kubenswrapper[4797]: I0930 17:54:28.238927 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:28 crc kubenswrapper[4797]: I0930 17:54:28.472796 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l"] Sep 30 17:54:28 crc kubenswrapper[4797]: W0930 17:54:28.488560 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55addd78_6666_44ec_9bb4_56a24edfbc41.slice/crio-0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47 WatchSource:0}: Error finding container 0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47: Status 404 returned error can't find the container with id 0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47 Sep 30 17:54:28 crc kubenswrapper[4797]: I0930 17:54:28.649265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerStarted","Data":"7d2ce7499838fad74f7dcf7f5c5ba43467d480689fbb1f3ea92806464f0647ce"} Sep 30 17:54:28 crc kubenswrapper[4797]: I0930 17:54:28.649331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerStarted","Data":"0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47"} Sep 30 17:54:30 crc kubenswrapper[4797]: I0930 17:54:30.661863 4797 generic.go:334] "Generic (PLEG): 
container finished" podID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerID="7d2ce7499838fad74f7dcf7f5c5ba43467d480689fbb1f3ea92806464f0647ce" exitCode=0 Sep 30 17:54:30 crc kubenswrapper[4797]: I0930 17:54:30.661966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerDied","Data":"7d2ce7499838fad74f7dcf7f5c5ba43467d480689fbb1f3ea92806464f0647ce"} Sep 30 17:54:32 crc kubenswrapper[4797]: I0930 17:54:32.677489 4797 generic.go:334] "Generic (PLEG): container finished" podID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerID="34bdde23505509a86f1cc1d62a65328c4630b3b219dbe4469d57adc191fc45f9" exitCode=0 Sep 30 17:54:32 crc kubenswrapper[4797]: I0930 17:54:32.677610 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerDied","Data":"34bdde23505509a86f1cc1d62a65328c4630b3b219dbe4469d57adc191fc45f9"} Sep 30 17:54:33 crc kubenswrapper[4797]: I0930 17:54:33.691991 4797 generic.go:334] "Generic (PLEG): container finished" podID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerID="79bff330e2f1c1e702fce0721006292b5c56d9ae0052c5c20ddca56e08946ae7" exitCode=0 Sep 30 17:54:33 crc kubenswrapper[4797]: I0930 17:54:33.692136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerDied","Data":"79bff330e2f1c1e702fce0721006292b5c56d9ae0052c5c20ddca56e08946ae7"} Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.036687 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.147389 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util\") pod \"55addd78-6666-44ec-9bb4-56a24edfbc41\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.147513 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9lrm\" (UniqueName: \"kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm\") pod \"55addd78-6666-44ec-9bb4-56a24edfbc41\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.147580 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle\") pod \"55addd78-6666-44ec-9bb4-56a24edfbc41\" (UID: \"55addd78-6666-44ec-9bb4-56a24edfbc41\") " Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.150169 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle" (OuterVolumeSpecName: "bundle") pod "55addd78-6666-44ec-9bb4-56a24edfbc41" (UID: "55addd78-6666-44ec-9bb4-56a24edfbc41"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.157216 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm" (OuterVolumeSpecName: "kube-api-access-n9lrm") pod "55addd78-6666-44ec-9bb4-56a24edfbc41" (UID: "55addd78-6666-44ec-9bb4-56a24edfbc41"). InnerVolumeSpecName "kube-api-access-n9lrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.170350 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util" (OuterVolumeSpecName: "util") pod "55addd78-6666-44ec-9bb4-56a24edfbc41" (UID: "55addd78-6666-44ec-9bb4-56a24edfbc41"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.249393 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.249497 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9lrm\" (UniqueName: \"kubernetes.io/projected/55addd78-6666-44ec-9bb4-56a24edfbc41-kube-api-access-n9lrm\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.249513 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55addd78-6666-44ec-9bb4-56a24edfbc41-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.712731 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" event={"ID":"55addd78-6666-44ec-9bb4-56a24edfbc41","Type":"ContainerDied","Data":"0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47"} Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.712819 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5724b210aff7cf71321b7c457296691cc8abd6fa269160e6f8469a9c122f47" Sep 30 17:54:35 crc kubenswrapper[4797]: I0930 17:54:35.712762 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l" Sep 30 17:54:44 crc kubenswrapper[4797]: I0930 17:54:44.191723 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:54:44 crc kubenswrapper[4797]: I0930 17:54:44.191994 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.614706 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl"] Sep 30 17:54:47 crc kubenswrapper[4797]: E0930 17:54:47.616310 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="util" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.616409 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="util" Sep 30 17:54:47 crc kubenswrapper[4797]: E0930 17:54:47.616550 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="pull" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.616635 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="pull" Sep 30 17:54:47 crc kubenswrapper[4797]: E0930 17:54:47.616713 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="extract" Sep 
30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.616783 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="extract" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.616978 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="55addd78-6666-44ec-9bb4-56a24edfbc41" containerName="extract" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.617519 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.619576 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-b2d4t" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.620327 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.624853 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.639785 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.707113 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4rh\" (UniqueName: \"kubernetes.io/projected/e3897355-dd17-4112-94fd-42c45c4cfa7f-kube-api-access-4r4rh\") pod \"obo-prometheus-operator-7c8cf85677-4rpbl\" (UID: \"e3897355-dd17-4112-94fd-42c45c4cfa7f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.736878 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh"] Sep 30 17:54:47 crc 
kubenswrapper[4797]: I0930 17:54:47.737504 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.741969 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.742498 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9jt57" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.748976 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.749630 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.761789 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.780525 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.808587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4rh\" (UniqueName: \"kubernetes.io/projected/e3897355-dd17-4112-94fd-42c45c4cfa7f-kube-api-access-4r4rh\") pod \"obo-prometheus-operator-7c8cf85677-4rpbl\" (UID: \"e3897355-dd17-4112-94fd-42c45c4cfa7f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.808678 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.808703 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: \"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.808720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.808748 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: \"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.835298 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4rh\" (UniqueName: 
\"kubernetes.io/projected/e3897355-dd17-4112-94fd-42c45c4cfa7f-kube-api-access-4r4rh\") pod \"obo-prometheus-operator-7c8cf85677-4rpbl\" (UID: \"e3897355-dd17-4112-94fd-42c45c4cfa7f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.871552 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-nvvj2"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.872188 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.875354 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-p2lll" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.875613 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.898601 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-nvvj2"] Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.909796 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.909845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: 
\"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.909873 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.909913 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: \"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.915484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.915716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: \"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.916867 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d153021c-fc00-4c10-91dd-69c13423dd4d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh\" (UID: \"d153021c-fc00-4c10-91dd-69c13423dd4d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.918000 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea30b0e-2f00-44a1-8e46-cc36ffc843a5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4\" (UID: \"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:47 crc kubenswrapper[4797]: I0930 17:54:47.933787 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.028268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768m6\" (UniqueName: \"kubernetes.io/projected/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-kube-api-access-768m6\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.028568 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.059669 
4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.067755 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.071174 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-r4wwh"] Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.071875 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.075188 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6ptmd" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.078856 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-r4wwh"] Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.142911 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hcxr\" (UniqueName: \"kubernetes.io/projected/91658bbd-3b03-40ce-af08-985444e42376-kube-api-access-4hcxr\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.142957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/91658bbd-3b03-40ce-af08-985444e42376-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" 
Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.142990 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768m6\" (UniqueName: \"kubernetes.io/projected/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-kube-api-access-768m6\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.143016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.153276 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.165939 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768m6\" (UniqueName: \"kubernetes.io/projected/f3d6451a-ed07-4fc4-9ebe-a8c8d514999c-kube-api-access-768m6\") pod \"observability-operator-cc5f78dfc-nvvj2\" (UID: \"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c\") " pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.192874 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.244373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcxr\" (UniqueName: \"kubernetes.io/projected/91658bbd-3b03-40ce-af08-985444e42376-kube-api-access-4hcxr\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.244712 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/91658bbd-3b03-40ce-af08-985444e42376-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.245425 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/91658bbd-3b03-40ce-af08-985444e42376-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.267423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcxr\" (UniqueName: \"kubernetes.io/projected/91658bbd-3b03-40ce-af08-985444e42376-kube-api-access-4hcxr\") pod \"perses-operator-54bc95c9fb-r4wwh\" (UID: \"91658bbd-3b03-40ce-af08-985444e42376\") " pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.304029 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl"] Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 
17:54:48.330686 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh"] Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.403465 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.437511 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-nvvj2"] Sep 30 17:54:48 crc kubenswrapper[4797]: W0930 17:54:48.462934 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d6451a_ed07_4fc4_9ebe_a8c8d514999c.slice/crio-ed4fb1e8b614196942e37899ab7e0ca0a5a35bb22b56f31f68dda5f382107132 WatchSource:0}: Error finding container ed4fb1e8b614196942e37899ab7e0ca0a5a35bb22b56f31f68dda5f382107132: Status 404 returned error can't find the container with id ed4fb1e8b614196942e37899ab7e0ca0a5a35bb22b56f31f68dda5f382107132 Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.577415 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4"] Sep 30 17:54:48 crc kubenswrapper[4797]: W0930 17:54:48.592249 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea30b0e_2f00_44a1_8e46_cc36ffc843a5.slice/crio-c72cbd67c3e6348e0964d0b869f354b46305322680be3546cf6df852460416f0 WatchSource:0}: Error finding container c72cbd67c3e6348e0964d0b869f354b46305322680be3546cf6df852460416f0: Status 404 returned error can't find the container with id c72cbd67c3e6348e0964d0b869f354b46305322680be3546cf6df852460416f0 Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.654888 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-r4wwh"] 
Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.795745 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" event={"ID":"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5","Type":"ContainerStarted","Data":"c72cbd67c3e6348e0964d0b869f354b46305322680be3546cf6df852460416f0"} Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.797277 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" event={"ID":"e3897355-dd17-4112-94fd-42c45c4cfa7f","Type":"ContainerStarted","Data":"9a83185181c8330e7c511d44cb1ef31c2a82798ab1698c1f17204bd10659cbc5"} Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.798627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" event={"ID":"d153021c-fc00-4c10-91dd-69c13423dd4d","Type":"ContainerStarted","Data":"b5dfc220072b85337b77f8100306c1b554c91a6e168503bb7ac0fa54e189264a"} Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.799756 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" event={"ID":"91658bbd-3b03-40ce-af08-985444e42376","Type":"ContainerStarted","Data":"522b9711ff121a22c2a65884e20a9b3ed922924e704c4d20859a70faf30d061e"} Sep 30 17:54:48 crc kubenswrapper[4797]: I0930 17:54:48.801106 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" event={"ID":"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c","Type":"ContainerStarted","Data":"ed4fb1e8b614196942e37899ab7e0ca0a5a35bb22b56f31f68dda5f382107132"} Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.524069 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.524782 4797 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" containerID="cri-o://52d2376499e3173decc31e80bf630b5e077ed09ca763b5ae35de6ec56008e5f8" gracePeriod=30 Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.540401 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.540677 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerName="route-controller-manager" containerID="cri-o://a8a009e20380cba9977a3ac9419b6a06930a6a3decc517dea6c298d42694ebe8" gracePeriod=30 Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.899836 4797 generic.go:334] "Generic (PLEG): container finished" podID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerID="a8a009e20380cba9977a3ac9419b6a06930a6a3decc517dea6c298d42694ebe8" exitCode=0 Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.899924 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" event={"ID":"8d1ebf89-17d5-4811-9ee9-05d7e50441e9","Type":"ContainerDied","Data":"a8a009e20380cba9977a3ac9419b6a06930a6a3decc517dea6c298d42694ebe8"} Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.901721 4797 generic.go:334] "Generic (PLEG): container finished" podID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerID="52d2376499e3173decc31e80bf630b5e077ed09ca763b5ae35de6ec56008e5f8" exitCode=0 Sep 30 17:55:00 crc kubenswrapper[4797]: I0930 17:55:00.901808 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" 
event={"ID":"0dd5e175-89ae-4e6e-9134-56d10a1974c5","Type":"ContainerDied","Data":"52d2376499e3173decc31e80bf630b5e077ed09ca763b5ae35de6ec56008e5f8"} Sep 30 17:55:01 crc kubenswrapper[4797]: I0930 17:55:01.910077 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" event={"ID":"0dd5e175-89ae-4e6e-9134-56d10a1974c5","Type":"ContainerDied","Data":"ddba231749194a04d4e9dba3909e49ce5721e1cbb7c966d3d4d679358dfa23ba"} Sep 30 17:55:01 crc kubenswrapper[4797]: I0930 17:55:01.910454 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddba231749194a04d4e9dba3909e49ce5721e1cbb7c966d3d4d679358dfa23ba" Sep 30 17:55:01 crc kubenswrapper[4797]: I0930 17:55:01.944299 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.021460 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57f87c44b4-kf48m"] Sep 30 17:55:02 crc kubenswrapper[4797]: E0930 17:55:02.021643 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.021654 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.022271 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" containerName="controller-manager" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.022674 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.044931 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f87c44b4-kf48m"] Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052086 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles\") pod \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052143 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp6n8\" (UniqueName: \"kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8\") pod \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052189 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config\") pod \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052221 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca\") pod \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\" (UID: \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052240 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert\") pod \"0dd5e175-89ae-4e6e-9134-56d10a1974c5\" (UID: 
\"0dd5e175-89ae-4e6e-9134-56d10a1974c5\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.052857 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0dd5e175-89ae-4e6e-9134-56d10a1974c5" (UID: "0dd5e175-89ae-4e6e-9134-56d10a1974c5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.053371 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config" (OuterVolumeSpecName: "config") pod "0dd5e175-89ae-4e6e-9134-56d10a1974c5" (UID: "0dd5e175-89ae-4e6e-9134-56d10a1974c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.061572 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "0dd5e175-89ae-4e6e-9134-56d10a1974c5" (UID: "0dd5e175-89ae-4e6e-9134-56d10a1974c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.079214 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8" (OuterVolumeSpecName: "kube-api-access-zp6n8") pod "0dd5e175-89ae-4e6e-9134-56d10a1974c5" (UID: "0dd5e175-89ae-4e6e-9134-56d10a1974c5"). InnerVolumeSpecName "kube-api-access-zp6n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.079587 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0dd5e175-89ae-4e6e-9134-56d10a1974c5" (UID: "0dd5e175-89ae-4e6e-9134-56d10a1974c5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153334 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153374 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153410 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153443 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nvh\" (UniqueName: 
\"kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153526 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153569 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dd5e175-89ae-4e6e-9134-56d10a1974c5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153579 4797 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153589 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp6n8\" (UniqueName: \"kubernetes.io/projected/0dd5e175-89ae-4e6e-9134-56d10a1974c5-kube-api-access-zp6n8\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153597 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.153605 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dd5e175-89ae-4e6e-9134-56d10a1974c5-client-ca\") on node 
\"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.154151 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f87c44b4-kf48m"] Sep 30 17:55:02 crc kubenswrapper[4797]: E0930 17:55:02.154635 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-76nvh proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" podUID="ac6d1982-3f25-41dd-930c-6c589c6400a5" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.254233 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.254275 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76nvh\" (UniqueName: \"kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.254316 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.254353 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.254375 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.256525 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.256888 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.257180 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 
17:55:02.262311 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.319869 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nvh\" (UniqueName: \"kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh\") pod \"controller-manager-57f87c44b4-kf48m\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.358087 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.455909 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config\") pod \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.456199 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca\") pod \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.456249 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hljd\" (UniqueName: \"kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd\") pod 
\"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.456298 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert\") pod \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\" (UID: \"8d1ebf89-17d5-4811-9ee9-05d7e50441e9\") " Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.456759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config" (OuterVolumeSpecName: "config") pod "8d1ebf89-17d5-4811-9ee9-05d7e50441e9" (UID: "8d1ebf89-17d5-4811-9ee9-05d7e50441e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.456998 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d1ebf89-17d5-4811-9ee9-05d7e50441e9" (UID: "8d1ebf89-17d5-4811-9ee9-05d7e50441e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.472766 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd" (OuterVolumeSpecName: "kube-api-access-7hljd") pod "8d1ebf89-17d5-4811-9ee9-05d7e50441e9" (UID: "8d1ebf89-17d5-4811-9ee9-05d7e50441e9"). InnerVolumeSpecName "kube-api-access-7hljd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.473225 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d1ebf89-17d5-4811-9ee9-05d7e50441e9" (UID: "8d1ebf89-17d5-4811-9ee9-05d7e50441e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.557446 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.557690 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.557757 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hljd\" (UniqueName: \"kubernetes.io/projected/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-kube-api-access-7hljd\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.557809 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1ebf89-17d5-4811-9ee9-05d7e50441e9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.917957 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" event={"ID":"d153021c-fc00-4c10-91dd-69c13423dd4d","Type":"ContainerStarted","Data":"e8e54e7861f81c9447f744d65966ec5cbc79b1a50b305d7b2d08352fe186c59d"} Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.921910 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" event={"ID":"91658bbd-3b03-40ce-af08-985444e42376","Type":"ContainerStarted","Data":"c57778dd04dd4c70e559dcf251d09267e07de015d02204d8686ec3924781d631"} Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.922584 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.924415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" event={"ID":"f3d6451a-ed07-4fc4-9ebe-a8c8d514999c","Type":"ContainerStarted","Data":"4f730422720ae8bd5c6143a81c9b26cac66d72734d2e371e48b6e6efdeb393b0"} Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.924756 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.927016 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.927031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq" event={"ID":"8d1ebf89-17d5-4811-9ee9-05d7e50441e9","Type":"ContainerDied","Data":"eead655a5922a5adab995cd3006e8a42bc185cea8d09bbdf1797e8c1a0acce6f"} Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.927158 4797 scope.go:117] "RemoveContainer" containerID="a8a009e20380cba9977a3ac9419b6a06930a6a3decc517dea6c298d42694ebe8" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.936834 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.937319 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" event={"ID":"9ea30b0e-2f00-44a1-8e46-cc36ffc843a5","Type":"ContainerStarted","Data":"da248cd3d20d0ed189760324e7d3ca70dad24b76673d579864c4e0cd7c216ef0"} Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.937380 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jgd8l" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.951834 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh" podStartSLOduration=2.359441743 podStartE2EDuration="15.951787711s" podCreationTimestamp="2025-09-30 17:54:47 +0000 UTC" firstStartedPulling="2025-09-30 17:54:48.349517147 +0000 UTC m=+738.872016385" lastFinishedPulling="2025-09-30 17:55:01.941863115 +0000 UTC m=+752.464362353" observedRunningTime="2025-09-30 17:55:02.943789214 +0000 UTC m=+753.466288482" watchObservedRunningTime="2025-09-30 17:55:02.951787711 +0000 UTC m=+753.474286959" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.967581 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.977322 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" Sep 30 17:55:02 crc kubenswrapper[4797]: I0930 17:55:02.978360 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" podStartSLOduration=1.673670302 podStartE2EDuration="14.978341268s" podCreationTimestamp="2025-09-30 17:54:48 +0000 UTC" firstStartedPulling="2025-09-30 17:54:48.67494407 +0000 UTC m=+739.197443308" lastFinishedPulling="2025-09-30 17:55:01.979615036 +0000 UTC m=+752.502114274" observedRunningTime="2025-09-30 17:55:02.977559986 +0000 UTC m=+753.500059234" watchObservedRunningTime="2025-09-30 17:55:02.978341268 +0000 UTC m=+753.500840516" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.012193 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4" podStartSLOduration=2.6606810850000002 podStartE2EDuration="16.012174692s" podCreationTimestamp="2025-09-30 17:54:47 +0000 UTC" firstStartedPulling="2025-09-30 17:54:48.594678356 +0000 UTC m=+739.117177604" lastFinishedPulling="2025-09-30 17:55:01.946171973 +0000 UTC m=+752.468671211" observedRunningTime="2025-09-30 17:55:03.008002067 +0000 UTC m=+753.530501315" watchObservedRunningTime="2025-09-30 17:55:03.012174692 +0000 UTC m=+753.534673930" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.036167 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-nvvj2" podStartSLOduration=2.558196575 podStartE2EDuration="16.036145727s" podCreationTimestamp="2025-09-30 17:54:47 +0000 UTC" firstStartedPulling="2025-09-30 17:54:48.465507696 +0000 UTC 
m=+738.988006934" lastFinishedPulling="2025-09-30 17:55:01.943456848 +0000 UTC m=+752.465956086" observedRunningTime="2025-09-30 17:55:03.033201446 +0000 UTC m=+753.555700714" watchObservedRunningTime="2025-09-30 17:55:03.036145727 +0000 UTC m=+753.558644965" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.057912 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.060650 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jgd8l"] Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063183 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert\") pod \"ac6d1982-3f25-41dd-930c-6c589c6400a5\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063232 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca\") pod \"ac6d1982-3f25-41dd-930c-6c589c6400a5\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063309 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config\") pod \"ac6d1982-3f25-41dd-930c-6c589c6400a5\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063338 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76nvh\" (UniqueName: \"kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh\") pod \"ac6d1982-3f25-41dd-930c-6c589c6400a5\" (UID: 
\"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063389 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles\") pod \"ac6d1982-3f25-41dd-930c-6c589c6400a5\" (UID: \"ac6d1982-3f25-41dd-930c-6c589c6400a5\") " Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.063990 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config" (OuterVolumeSpecName: "config") pod "ac6d1982-3f25-41dd-930c-6c589c6400a5" (UID: "ac6d1982-3f25-41dd-930c-6c589c6400a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.064029 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ac6d1982-3f25-41dd-930c-6c589c6400a5" (UID: "ac6d1982-3f25-41dd-930c-6c589c6400a5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.064281 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac6d1982-3f25-41dd-930c-6c589c6400a5" (UID: "ac6d1982-3f25-41dd-930c-6c589c6400a5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.066900 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac6d1982-3f25-41dd-930c-6c589c6400a5" (UID: "ac6d1982-3f25-41dd-930c-6c589c6400a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.069636 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh" (OuterVolumeSpecName: "kube-api-access-76nvh") pod "ac6d1982-3f25-41dd-930c-6c589c6400a5" (UID: "ac6d1982-3f25-41dd-930c-6c589c6400a5"). InnerVolumeSpecName "kube-api-access-76nvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.127601 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.130770 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2s4jq"] Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.164651 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.164682 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76nvh\" (UniqueName: \"kubernetes.io/projected/ac6d1982-3f25-41dd-930c-6c589c6400a5-kube-api-access-76nvh\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.164694 4797 reconciler_common.go:293] "Volume detached 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.164702 4797 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6d1982-3f25-41dd-930c-6c589c6400a5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.164711 4797 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac6d1982-3f25-41dd-930c-6c589c6400a5-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.549060 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj"] Sep 30 17:55:03 crc kubenswrapper[4797]: E0930 17:55:03.549264 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerName="route-controller-manager" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.549276 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerName="route-controller-manager" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.549384 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" containerName="route-controller-manager" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.549761 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.554613 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.554920 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.555270 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.556188 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.556402 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.556765 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.569086 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06324904-81ca-4e85-8251-21625e7a904e-serving-cert\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.569168 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llb4f\" (UniqueName: \"kubernetes.io/projected/06324904-81ca-4e85-8251-21625e7a904e-kube-api-access-llb4f\") pod 
\"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.569197 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-config\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.569226 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-client-ca\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.571169 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj"] Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.670561 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llb4f\" (UniqueName: \"kubernetes.io/projected/06324904-81ca-4e85-8251-21625e7a904e-kube-api-access-llb4f\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.670602 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-config\") pod 
\"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.670631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-client-ca\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.670673 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06324904-81ca-4e85-8251-21625e7a904e-serving-cert\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.671845 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-client-ca\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.672053 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06324904-81ca-4e85-8251-21625e7a904e-config\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.678011 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06324904-81ca-4e85-8251-21625e7a904e-serving-cert\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.700063 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llb4f\" (UniqueName: \"kubernetes.io/projected/06324904-81ca-4e85-8251-21625e7a904e-kube-api-access-llb4f\") pod \"route-controller-manager-767c78c759-76pdj\" (UID: \"06324904-81ca-4e85-8251-21625e7a904e\") " pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.869471 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:03 crc kubenswrapper[4797]: I0930 17:55:03.962219 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f87c44b4-kf48m" Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.012487 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f87c44b4-kf48m"] Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.014287 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57f87c44b4-kf48m"] Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.245173 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd5e175-89ae-4e6e-9134-56d10a1974c5" path="/var/lib/kubelet/pods/0dd5e175-89ae-4e6e-9134-56d10a1974c5/volumes" Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.245930 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1ebf89-17d5-4811-9ee9-05d7e50441e9" path="/var/lib/kubelet/pods/8d1ebf89-17d5-4811-9ee9-05d7e50441e9/volumes" Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.246521 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6d1982-3f25-41dd-930c-6c589c6400a5" path="/var/lib/kubelet/pods/ac6d1982-3f25-41dd-930c-6c589c6400a5/volumes" Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.291461 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj"] Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.969153 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" event={"ID":"06324904-81ca-4e85-8251-21625e7a904e","Type":"ContainerStarted","Data":"8be677aeb0eb3dbe286bda6f4eedb2b72dadbe4615c8d422d14577cacebf4199"} Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.969414 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" 
event={"ID":"06324904-81ca-4e85-8251-21625e7a904e","Type":"ContainerStarted","Data":"ff49f5d999c318d03ace0b9b5c66774cdb85037a5c3eb197c0d42e607baf7b89"} Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.969465 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:04 crc kubenswrapper[4797]: I0930 17:55:04.990643 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" podStartSLOduration=2.990624504 podStartE2EDuration="2.990624504s" podCreationTimestamp="2025-09-30 17:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:55:04.986951484 +0000 UTC m=+755.509450722" watchObservedRunningTime="2025-09-30 17:55:04.990624504 +0000 UTC m=+755.513123752" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.195187 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-767c78c759-76pdj" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.546615 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67594cfd4c-vnnz9"] Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.548029 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.550326 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.552088 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.552860 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.553227 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.553751 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.554240 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.561269 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.569276 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67594cfd4c-vnnz9"] Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.694357 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-client-ca\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " 
pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.694464 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5lj\" (UniqueName: \"kubernetes.io/projected/9233b1a4-fa91-49ea-95b6-d94dd7c86996-kube-api-access-kt5lj\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.694506 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-proxy-ca-bundles\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.694571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-config\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.694602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9233b1a4-fa91-49ea-95b6-d94dd7c86996-serving-cert\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.795958 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-config\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.796035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9233b1a4-fa91-49ea-95b6-d94dd7c86996-serving-cert\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.796108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-client-ca\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.796179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5lj\" (UniqueName: \"kubernetes.io/projected/9233b1a4-fa91-49ea-95b6-d94dd7c86996-kube-api-access-kt5lj\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.796220 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-proxy-ca-bundles\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.797533 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-config\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.797710 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-client-ca\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.798056 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9233b1a4-fa91-49ea-95b6-d94dd7c86996-proxy-ca-bundles\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.803512 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9233b1a4-fa91-49ea-95b6-d94dd7c86996-serving-cert\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.820635 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5lj\" (UniqueName: \"kubernetes.io/projected/9233b1a4-fa91-49ea-95b6-d94dd7c86996-kube-api-access-kt5lj\") pod \"controller-manager-67594cfd4c-vnnz9\" (UID: \"9233b1a4-fa91-49ea-95b6-d94dd7c86996\") " pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 
17:55:05 crc kubenswrapper[4797]: I0930 17:55:05.866420 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:06 crc kubenswrapper[4797]: I0930 17:55:06.361188 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67594cfd4c-vnnz9"] Sep 30 17:55:06 crc kubenswrapper[4797]: W0930 17:55:06.373181 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9233b1a4_fa91_49ea_95b6_d94dd7c86996.slice/crio-6453cc4d3833b98c1eaf7a75b2c405229de0b263d1d6a0651894d6645c299d97 WatchSource:0}: Error finding container 6453cc4d3833b98c1eaf7a75b2c405229de0b263d1d6a0651894d6645c299d97: Status 404 returned error can't find the container with id 6453cc4d3833b98c1eaf7a75b2c405229de0b263d1d6a0651894d6645c299d97 Sep 30 17:55:06 crc kubenswrapper[4797]: I0930 17:55:06.980541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" event={"ID":"9233b1a4-fa91-49ea-95b6-d94dd7c86996","Type":"ContainerStarted","Data":"80c47856c0784b71b90fbdb8495ffa655eb52b31632f574cee9e825b0036e1ce"} Sep 30 17:55:06 crc kubenswrapper[4797]: I0930 17:55:06.981201 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" event={"ID":"9233b1a4-fa91-49ea-95b6-d94dd7c86996","Type":"ContainerStarted","Data":"6453cc4d3833b98c1eaf7a75b2c405229de0b263d1d6a0651894d6645c299d97"} Sep 30 17:55:07 crc kubenswrapper[4797]: I0930 17:55:07.004227 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" podStartSLOduration=5.004212176 podStartE2EDuration="5.004212176s" podCreationTimestamp="2025-09-30 17:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:55:07.001498942 +0000 UTC m=+757.523998180" watchObservedRunningTime="2025-09-30 17:55:07.004212176 +0000 UTC m=+757.526711414" Sep 30 17:55:07 crc kubenswrapper[4797]: I0930 17:55:07.985918 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:07 crc kubenswrapper[4797]: I0930 17:55:07.993950 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67594cfd4c-vnnz9" Sep 30 17:55:08 crc kubenswrapper[4797]: I0930 17:55:08.406389 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-r4wwh" Sep 30 17:55:10 crc kubenswrapper[4797]: I0930 17:55:10.653625 4797 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:55:14 crc kubenswrapper[4797]: I0930 17:55:14.192650 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:55:14 crc kubenswrapper[4797]: I0930 17:55:14.193020 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:55:23 crc kubenswrapper[4797]: I0930 17:55:23.089410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" 
event={"ID":"e3897355-dd17-4112-94fd-42c45c4cfa7f","Type":"ContainerStarted","Data":"849aa06aab429dbb790384a7cb9efd69bd5a61c0a925c4deb68ff949ec6a27a6"} Sep 30 17:55:23 crc kubenswrapper[4797]: I0930 17:55:23.123908 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4rpbl" podStartSLOduration=2.660572093 podStartE2EDuration="36.123883234s" podCreationTimestamp="2025-09-30 17:54:47 +0000 UTC" firstStartedPulling="2025-09-30 17:54:48.320613088 +0000 UTC m=+738.843112326" lastFinishedPulling="2025-09-30 17:55:21.783924229 +0000 UTC m=+772.306423467" observedRunningTime="2025-09-30 17:55:23.119161354 +0000 UTC m=+773.641660632" watchObservedRunningTime="2025-09-30 17:55:23.123883234 +0000 UTC m=+773.646382512" Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.706329 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"] Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.708068 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.713528 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.725866 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"]
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.773858 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch2lg\" (UniqueName: \"kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.773916 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.774053 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.875269 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch2lg\" (UniqueName: \"kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.875375 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.875489 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.876264 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.876527 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:40 crc kubenswrapper[4797]: I0930 17:55:40.901509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch2lg\" (UniqueName: \"kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:41 crc kubenswrapper[4797]: I0930 17:55:41.032180 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:41 crc kubenswrapper[4797]: I0930 17:55:41.464730 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"]
Sep 30 17:55:42 crc kubenswrapper[4797]: I0930 17:55:42.201818 4797 generic.go:334] "Generic (PLEG): container finished" podID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerID="13818c123e8148126548152b31361ede3c0729cb55e53cd0fd999e27189b54c8" exitCode=0
Sep 30 17:55:42 crc kubenswrapper[4797]: I0930 17:55:42.201875 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z" event={"ID":"90d265e8-7eec-4b51-8784-3ec3efab5526","Type":"ContainerDied","Data":"13818c123e8148126548152b31361ede3c0729cb55e53cd0fd999e27189b54c8"}
Sep 30 17:55:42 crc kubenswrapper[4797]: I0930 17:55:42.203774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z" event={"ID":"90d265e8-7eec-4b51-8784-3ec3efab5526","Type":"ContainerStarted","Data":"9a56182f9323e1f0187be856d36914ff63a1070606445a65c89b532cca0b6db2"}
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.032211 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"]
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.033993 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.050118 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"]
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.104570 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.108001 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvwf\" (UniqueName: \"kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.108124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.209336 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.209565 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvwf\" (UniqueName: \"kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.209605 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.210133 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.210508 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.239374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvwf\" (UniqueName: \"kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf\") pod \"redhat-operators-8dbcg\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.349874 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:43 crc kubenswrapper[4797]: I0930 17:55:43.793552 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"]
Sep 30 17:55:43 crc kubenswrapper[4797]: W0930 17:55:43.803665 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda851065c_6dac_4064_bd0f_89afccf148e5.slice/crio-c056fcb6d470ba29810f4e2f51c67531c112eca3a07600b300c2bc230e9ff965 WatchSource:0}: Error finding container c056fcb6d470ba29810f4e2f51c67531c112eca3a07600b300c2bc230e9ff965: Status 404 returned error can't find the container with id c056fcb6d470ba29810f4e2f51c67531c112eca3a07600b300c2bc230e9ff965
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.192568 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.192911 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.192962 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9"
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.193543 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.193599 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151" gracePeriod=600
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.216123 4797 generic.go:334] "Generic (PLEG): container finished" podID="a851065c-6dac-4064-bd0f-89afccf148e5" containerID="9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff" exitCode=0
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.216236 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerDied","Data":"9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff"}
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.216271 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerStarted","Data":"c056fcb6d470ba29810f4e2f51c67531c112eca3a07600b300c2bc230e9ff965"}
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.219045 4797 generic.go:334] "Generic (PLEG): container finished" podID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerID="3cb7e581340bc2b4b9350819c078740c9d688fce23e2f4c0c35011d03e30c178" exitCode=0
Sep 30 17:55:44 crc kubenswrapper[4797]: I0930 17:55:44.219097 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z" event={"ID":"90d265e8-7eec-4b51-8784-3ec3efab5526","Type":"ContainerDied","Data":"3cb7e581340bc2b4b9350819c078740c9d688fce23e2f4c0c35011d03e30c178"}
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.232096 4797 generic.go:334] "Generic (PLEG): container finished" podID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerID="c37472c047531689cc241317e9de27b9a31c9708a1cfb0ce9e75821e2c65e33f" exitCode=0
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.232178 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z" event={"ID":"90d265e8-7eec-4b51-8784-3ec3efab5526","Type":"ContainerDied","Data":"c37472c047531689cc241317e9de27b9a31c9708a1cfb0ce9e75821e2c65e33f"}
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.236004 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151" exitCode=0
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.236034 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151"}
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.236053 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304"}
Sep 30 17:55:45 crc kubenswrapper[4797]: I0930 17:55:45.236069 4797 scope.go:117] "RemoveContainer" containerID="7dd2c6a54e085cc3d196b4b71ebef6b3b0176665a4983ddfa820e07318e310ae"
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.245283 4797 generic.go:334] "Generic (PLEG): container finished" podID="a851065c-6dac-4064-bd0f-89afccf148e5" containerID="4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283" exitCode=0
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.257698 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerDied","Data":"4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283"}
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.537806 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.658760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle\") pod \"90d265e8-7eec-4b51-8784-3ec3efab5526\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") "
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.658808 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch2lg\" (UniqueName: \"kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg\") pod \"90d265e8-7eec-4b51-8784-3ec3efab5526\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") "
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.658867 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util\") pod \"90d265e8-7eec-4b51-8784-3ec3efab5526\" (UID: \"90d265e8-7eec-4b51-8784-3ec3efab5526\") "
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.659815 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle" (OuterVolumeSpecName: "bundle") pod "90d265e8-7eec-4b51-8784-3ec3efab5526" (UID: "90d265e8-7eec-4b51-8784-3ec3efab5526"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.664648 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg" (OuterVolumeSpecName: "kube-api-access-ch2lg") pod "90d265e8-7eec-4b51-8784-3ec3efab5526" (UID: "90d265e8-7eec-4b51-8784-3ec3efab5526"). InnerVolumeSpecName "kube-api-access-ch2lg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.671598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util" (OuterVolumeSpecName: "util") pod "90d265e8-7eec-4b51-8784-3ec3efab5526" (UID: "90d265e8-7eec-4b51-8784-3ec3efab5526"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.760073 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.760103 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch2lg\" (UniqueName: \"kubernetes.io/projected/90d265e8-7eec-4b51-8784-3ec3efab5526-kube-api-access-ch2lg\") on node \"crc\" DevicePath \"\""
Sep 30 17:55:46 crc kubenswrapper[4797]: I0930 17:55:46.760114 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d265e8-7eec-4b51-8784-3ec3efab5526-util\") on node \"crc\" DevicePath \"\""
Sep 30 17:55:47 crc kubenswrapper[4797]: I0930 17:55:47.258796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerStarted","Data":"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8"}
Sep 30 17:55:47 crc kubenswrapper[4797]: I0930 17:55:47.263700 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z" event={"ID":"90d265e8-7eec-4b51-8784-3ec3efab5526","Type":"ContainerDied","Data":"9a56182f9323e1f0187be856d36914ff63a1070606445a65c89b532cca0b6db2"}
Sep 30 17:55:47 crc kubenswrapper[4797]: I0930 17:55:47.263761 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a56182f9323e1f0187be856d36914ff63a1070606445a65c89b532cca0b6db2"
Sep 30 17:55:47 crc kubenswrapper[4797]: I0930 17:55:47.263967 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z"
Sep 30 17:55:47 crc kubenswrapper[4797]: I0930 17:55:47.308665 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8dbcg" podStartSLOduration=1.6192408600000001 podStartE2EDuration="4.308639436s" podCreationTimestamp="2025-09-30 17:55:43 +0000 UTC" firstStartedPulling="2025-09-30 17:55:44.221721284 +0000 UTC m=+794.744220522" lastFinishedPulling="2025-09-30 17:55:46.91111985 +0000 UTC m=+797.433619098" observedRunningTime="2025-09-30 17:55:47.300277108 +0000 UTC m=+797.822776446" watchObservedRunningTime="2025-09-30 17:55:47.308639436 +0000 UTC m=+797.831138704"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.434747 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"]
Sep 30 17:55:49 crc kubenswrapper[4797]: E0930 17:55:49.435539 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="util"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.435555 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="util"
Sep 30 17:55:49 crc kubenswrapper[4797]: E0930 17:55:49.435571 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="pull"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.435579 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="pull"
Sep 30 17:55:49 crc kubenswrapper[4797]: E0930 17:55:49.435590 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="extract"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.435597 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="extract"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.435718 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d265e8-7eec-4b51-8784-3ec3efab5526" containerName="extract"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.436235 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.437946 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.438222 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ljzws"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.446511 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"]
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.460245 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.503311 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hj9\" (UniqueName: \"kubernetes.io/projected/4ce57370-6b4e-4a2a-be84-6cea546156ac-kube-api-access-22hj9\") pod \"nmstate-operator-5d6f6cfd66-d2f5j\" (UID: \"4ce57370-6b4e-4a2a-be84-6cea546156ac\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.605257 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj9\" (UniqueName: \"kubernetes.io/projected/4ce57370-6b4e-4a2a-be84-6cea546156ac-kube-api-access-22hj9\") pod \"nmstate-operator-5d6f6cfd66-d2f5j\" (UID: \"4ce57370-6b4e-4a2a-be84-6cea546156ac\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.627009 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hj9\" (UniqueName: \"kubernetes.io/projected/4ce57370-6b4e-4a2a-be84-6cea546156ac-kube-api-access-22hj9\") pod \"nmstate-operator-5d6f6cfd66-d2f5j\" (UID: \"4ce57370-6b4e-4a2a-be84-6cea546156ac\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.751819 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"
Sep 30 17:55:49 crc kubenswrapper[4797]: I0930 17:55:49.975121 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j"]
Sep 30 17:55:50 crc kubenswrapper[4797]: I0930 17:55:50.279878 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j" event={"ID":"4ce57370-6b4e-4a2a-be84-6cea546156ac","Type":"ContainerStarted","Data":"75448034f9802a127849744282505bc361e83b6aa80dd7eff94025710f62cdd2"}
Sep 30 17:55:53 crc kubenswrapper[4797]: I0930 17:55:53.351048 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:53 crc kubenswrapper[4797]: I0930 17:55:53.351413 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:53 crc kubenswrapper[4797]: I0930 17:55:53.394502 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:54 crc kubenswrapper[4797]: I0930 17:55:54.360382 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8dbcg"
Sep 30 17:55:55 crc kubenswrapper[4797]: I0930 17:55:55.314026 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j" event={"ID":"4ce57370-6b4e-4a2a-be84-6cea546156ac","Type":"ContainerStarted","Data":"f0dd21d2e75404b33b05776c09b40d88459a46004dc30c5127d8395dab1d27c6"}
Sep 30 17:55:55 crc kubenswrapper[4797]: I0930 17:55:55.346096 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-d2f5j" podStartSLOduration=1.77973706 podStartE2EDuration="6.346059066s" podCreationTimestamp="2025-09-30 17:55:49 +0000 UTC" firstStartedPulling="2025-09-30 17:55:49.982712155 +0000 UTC m=+800.505211393" lastFinishedPulling="2025-09-30 17:55:54.549034161 +0000 UTC m=+805.071533399" observedRunningTime="2025-09-30 17:55:55.339107407 +0000 UTC m=+805.861606685" watchObservedRunningTime="2025-09-30 17:55:55.346059066 +0000 UTC m=+805.868558344"
Sep 30 17:55:55 crc kubenswrapper[4797]: I0930 17:55:55.626162 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.347913 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8dbcg" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="registry-server" containerID="cri-o://c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8" gracePeriod=2
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.427723 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.429691 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.433683 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5vsmj"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.437925 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.438745 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.440905 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.450570 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.467052 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-x2tc2"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.473352 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.500298 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.500859 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.500917 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc97s\" (UniqueName: \"kubernetes.io/projected/3417981c-8b7a-48f5-b504-c3a358706f7f-kube-api-access-rc97s\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.500944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9lbp\" (UniqueName: \"kubernetes.io/projected/5e6982a1-49c0-428e-8b68-38899f1be907-kube-api-access-w9lbp\") pod \"nmstate-metrics-58fcddf996-4nbj6\" (UID: \"5e6982a1-49c0-428e-8b68-38899f1be907\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.590088 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.593663 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.598807 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rsmm5"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.599042 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.600883 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602205 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc97s\" (UniqueName: \"kubernetes.io/projected/3417981c-8b7a-48f5-b504-c3a358706f7f-kube-api-access-rc97s\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602260 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9lbp\" (UniqueName: \"kubernetes.io/projected/5e6982a1-49c0-428e-8b68-38899f1be907-kube-api-access-w9lbp\") pod \"nmstate-metrics-58fcddf996-4nbj6\" (UID: \"5e6982a1-49c0-428e-8b68-38899f1be907\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602293 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-ovs-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602326 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zs5m\" (UniqueName: \"kubernetes.io/projected/ccc6478c-07c2-431f-a964-1db62dd3800e-kube-api-access-6zs5m\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602378 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-nmstate-lock\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602446 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-dbus-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.602469 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: E0930 17:55:56.602646 4797 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Sep 30 17:55:56 crc kubenswrapper[4797]: E0930 17:55:56.602728 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair podName:3417981c-8b7a-48f5-b504-c3a358706f7f nodeName:}" failed. No retries permitted until 2025-09-30 17:55:57.102701893 +0000 UTC m=+807.625201131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair") pod "nmstate-webhook-6d689559c5-bxqvs" (UID: "3417981c-8b7a-48f5-b504-c3a358706f7f") : secret "openshift-nmstate-webhook" not found
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.605077 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"]
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.626875 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9lbp\" (UniqueName: \"kubernetes.io/projected/5e6982a1-49c0-428e-8b68-38899f1be907-kube-api-access-w9lbp\") pod \"nmstate-metrics-58fcddf996-4nbj6\" (UID: \"5e6982a1-49c0-428e-8b68-38899f1be907\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.633879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc97s\" (UniqueName: \"kubernetes.io/projected/3417981c-8b7a-48f5-b504-c3a358706f7f-kube-api-access-rc97s\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705699 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-dbus-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-ovs-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705802 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zs5m\" (UniqueName: \"kubernetes.io/projected/ccc6478c-07c2-431f-a964-1db62dd3800e-kube-api-access-6zs5m\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705855 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-nmstate-lock\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705884 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeadbff5-abfe-4f8c-a24f-e62db0f23612-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705911 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cskw\" (UniqueName: \"kubernetes.io/projected/aeadbff5-abfe-4f8c-a24f-e62db0f23612-kube-api-access-8cskw\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.705936 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aeadbff5-abfe-4f8c-a24f-e62db0f23612-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.706313 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-dbus-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.706354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-ovs-socket\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.706668 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ccc6478c-07c2-431f-a964-1db62dd3800e-nmstate-lock\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.747752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zs5m\" (UniqueName: \"kubernetes.io/projected/ccc6478c-07c2-431f-a964-1db62dd3800e-kube-api-access-6zs5m\") pod \"nmstate-handler-x2tc2\" (UID: \"ccc6478c-07c2-431f-a964-1db62dd3800e\") " pod="openshift-nmstate/nmstate-handler-x2tc2"
Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.806562 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\"
(UniqueName: \"kubernetes.io/secret/aeadbff5-abfe-4f8c-a24f-e62db0f23612-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.806902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cskw\" (UniqueName: \"kubernetes.io/projected/aeadbff5-abfe-4f8c-a24f-e62db0f23612-kube-api-access-8cskw\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.806929 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aeadbff5-abfe-4f8c-a24f-e62db0f23612-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.807852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aeadbff5-abfe-4f8c-a24f-e62db0f23612-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.810823 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aeadbff5-abfe-4f8c-a24f-e62db0f23612-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.821898 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.823037 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dbcg" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.829675 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cskw\" (UniqueName: \"kubernetes.io/projected/aeadbff5-abfe-4f8c-a24f-e62db0f23612-kube-api-access-8cskw\") pod \"nmstate-console-plugin-864bb6dfb5-g9n7q\" (UID: \"aeadbff5-abfe-4f8c-a24f-e62db0f23612\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.850036 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-x2tc2" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.908461 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities\") pod \"a851065c-6dac-4064-bd0f-89afccf148e5\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.908530 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgvwf\" (UniqueName: \"kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf\") pod \"a851065c-6dac-4064-bd0f-89afccf148e5\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") " Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.908600 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content\") pod \"a851065c-6dac-4064-bd0f-89afccf148e5\" (UID: \"a851065c-6dac-4064-bd0f-89afccf148e5\") 
" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.910124 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities" (OuterVolumeSpecName: "utilities") pod "a851065c-6dac-4064-bd0f-89afccf148e5" (UID: "a851065c-6dac-4064-bd0f-89afccf148e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.912709 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf" (OuterVolumeSpecName: "kube-api-access-mgvwf") pod "a851065c-6dac-4064-bd0f-89afccf148e5" (UID: "a851065c-6dac-4064-bd0f-89afccf148e5"). InnerVolumeSpecName "kube-api-access-mgvwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.919723 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.938521 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb7c8495b-75wdc"] Sep 30 17:55:56 crc kubenswrapper[4797]: E0930 17:55:56.938970 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="extract-content" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.938994 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="extract-content" Sep 30 17:55:56 crc kubenswrapper[4797]: E0930 17:55:56.939007 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="registry-server" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.939013 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="registry-server" Sep 30 17:55:56 crc kubenswrapper[4797]: E0930 17:55:56.939024 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="extract-utilities" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.939032 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="extract-utilities" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.939144 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" containerName="registry-server" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.939736 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.955940 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7c8495b-75wdc"] Sep 30 17:55:56 crc kubenswrapper[4797]: I0930 17:55:56.994452 4797 scope.go:117] "RemoveContainer" containerID="52d2376499e3173decc31e80bf630b5e077ed09ca763b5ae35de6ec56008e5f8" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.006780 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a851065c-6dac-4064-bd0f-89afccf148e5" (UID: "a851065c-6dac-4064-bd0f-89afccf148e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012035 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012074 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsxk\" (UniqueName: \"kubernetes.io/projected/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-kube-api-access-hdsxk\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012108 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-config\") pod 
\"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012138 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-oauth-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012160 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-service-ca\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012177 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-oauth-config\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012213 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-trusted-ca-bundle\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012266 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-utilities\") 
on node \"crc\" DevicePath \"\"" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012285 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgvwf\" (UniqueName: \"kubernetes.io/projected/a851065c-6dac-4064-bd0f-89afccf148e5-kube-api-access-mgvwf\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.012296 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a851065c-6dac-4064-bd0f-89afccf148e5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-config\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114069 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-oauth-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-service-ca\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-oauth-config\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114154 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-trusted-ca-bundle\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.114277 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsxk\" (UniqueName: \"kubernetes.io/projected/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-kube-api-access-hdsxk\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.115217 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-service-ca\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.115944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-trusted-ca-bundle\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.115978 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-oauth-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.116558 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-config\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.119678 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3417981c-8b7a-48f5-b504-c3a358706f7f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bxqvs\" (UID: \"3417981c-8b7a-48f5-b504-c3a358706f7f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.119697 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-oauth-config\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.120480 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-console-serving-cert\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.130225 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsxk\" (UniqueName: \"kubernetes.io/projected/fcbdae5d-d4f4-4499-bb30-8282d22ae4a3-kube-api-access-hdsxk\") pod \"console-6fb7c8495b-75wdc\" (UID: \"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3\") " pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.135235 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.298362 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.356456 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6"] Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.357810 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x2tc2" event={"ID":"ccc6478c-07c2-431f-a964-1db62dd3800e","Type":"ContainerStarted","Data":"85ba32e3b12f3a09f947ea3001a62b733d592354fa0ba4a03bffa0cda6b9e739"} Sep 30 17:55:57 crc kubenswrapper[4797]: W0930 17:55:57.370802 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6982a1_49c0_428e_8b68_38899f1be907.slice/crio-92dc8a7b468c7005fc3c73f52115a0032f3ad45f0e74b1f2bb83d0fc510257c4 WatchSource:0}: Error finding container 92dc8a7b468c7005fc3c73f52115a0032f3ad45f0e74b1f2bb83d0fc510257c4: Status 404 returned error can't find the container with id 92dc8a7b468c7005fc3c73f52115a0032f3ad45f0e74b1f2bb83d0fc510257c4 Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.372353 4797 generic.go:334] "Generic (PLEG): container finished" podID="a851065c-6dac-4064-bd0f-89afccf148e5" containerID="c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8" exitCode=0 Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.372403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerDied","Data":"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8"} Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.372450 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dbcg" event={"ID":"a851065c-6dac-4064-bd0f-89afccf148e5","Type":"ContainerDied","Data":"c056fcb6d470ba29810f4e2f51c67531c112eca3a07600b300c2bc230e9ff965"} 
Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.372471 4797 scope.go:117] "RemoveContainer" containerID="c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.372609 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dbcg" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.413873 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"] Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.417831 4797 scope.go:117] "RemoveContainer" containerID="4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.445420 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8dbcg"] Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.467622 4797 scope.go:117] "RemoveContainer" containerID="9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.479416 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q"] Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.539204 4797 scope.go:117] "RemoveContainer" containerID="c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8" Sep 30 17:55:57 crc kubenswrapper[4797]: E0930 17:55:57.541772 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8\": container with ID starting with c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8 not found: ID does not exist" containerID="c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.541847 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8"} err="failed to get container status \"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8\": rpc error: code = NotFound desc = could not find container \"c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8\": container with ID starting with c596411034760705e2d5e49c29f2ebcde0c6d7e8a8d04865931f58a015e003d8 not found: ID does not exist" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.541877 4797 scope.go:117] "RemoveContainer" containerID="4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283" Sep 30 17:55:57 crc kubenswrapper[4797]: E0930 17:55:57.552174 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283\": container with ID starting with 4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283 not found: ID does not exist" containerID="4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.552237 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283"} err="failed to get container status \"4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283\": rpc error: code = NotFound desc = could not find container \"4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283\": container with ID starting with 4f0404f04eed0f20e9015d17a0f1954ef8a805b26efbf771b27877357503e283 not found: ID does not exist" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.552274 4797 scope.go:117] "RemoveContainer" containerID="9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff" Sep 30 17:55:57 crc kubenswrapper[4797]: E0930 17:55:57.552880 4797 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff\": container with ID starting with 9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff not found: ID does not exist" containerID="9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.552939 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff"} err="failed to get container status \"9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff\": rpc error: code = NotFound desc = could not find container \"9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff\": container with ID starting with 9aab6f0b2cd4c781c6c1ec6fa40fbb3d68a732a7982c6af716bfb738720b4aff not found: ID does not exist" Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.553919 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs"] Sep 30 17:55:57 crc kubenswrapper[4797]: I0930 17:55:57.568871 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7c8495b-75wdc"] Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.257980 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a851065c-6dac-4064-bd0f-89afccf148e5" path="/var/lib/kubelet/pods/a851065c-6dac-4064-bd0f-89afccf148e5/volumes" Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.389571 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7c8495b-75wdc" event={"ID":"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3","Type":"ContainerStarted","Data":"632ddac5f8b4cb9a6ba944a2aa179a28d047afe839e60703cae96715508f4a57"} Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.390028 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-6fb7c8495b-75wdc" event={"ID":"fcbdae5d-d4f4-4499-bb30-8282d22ae4a3","Type":"ContainerStarted","Data":"630156bbf60575ccfe509a337a81de02b29eff943592c673b0f3d9ae65642352"} Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.392829 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" event={"ID":"3417981c-8b7a-48f5-b504-c3a358706f7f","Type":"ContainerStarted","Data":"995291ecd0dfc0ecfc8acea995bea9f85684820e761da2402dca7078b4fcd5b7"} Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.395261 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" event={"ID":"aeadbff5-abfe-4f8c-a24f-e62db0f23612","Type":"ContainerStarted","Data":"2a9aa97a120fe77102f588c8485983cba83381bcfcac2b2ffa09f016d5636823"} Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.405473 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6" event={"ID":"5e6982a1-49c0-428e-8b68-38899f1be907","Type":"ContainerStarted","Data":"92dc8a7b468c7005fc3c73f52115a0032f3ad45f0e74b1f2bb83d0fc510257c4"} Sep 30 17:55:58 crc kubenswrapper[4797]: I0930 17:55:58.414141 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb7c8495b-75wdc" podStartSLOduration=2.414124855 podStartE2EDuration="2.414124855s" podCreationTimestamp="2025-09-30 17:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:55:58.412586313 +0000 UTC m=+808.935085571" watchObservedRunningTime="2025-09-30 17:55:58.414124855 +0000 UTC m=+808.936624103" Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.434413 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" 
event={"ID":"3417981c-8b7a-48f5-b504-c3a358706f7f","Type":"ContainerStarted","Data":"7915ed0696ff7fcd12dccc326c62693a2ef5bce9bc96be780230a361ea0178d9"} Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.435741 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.438029 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x2tc2" event={"ID":"ccc6478c-07c2-431f-a964-1db62dd3800e","Type":"ContainerStarted","Data":"32a0e574143822f6d2f687e1d6f9abdb6bd001df50e8539efdcb014cfe03fcca"} Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.438849 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-x2tc2" Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.440517 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" event={"ID":"aeadbff5-abfe-4f8c-a24f-e62db0f23612","Type":"ContainerStarted","Data":"4e55622516e8a40e9e92876acd8accc05ae1c703a1f9473e6cd80ade602f867a"} Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.442610 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6" event={"ID":"5e6982a1-49c0-428e-8b68-38899f1be907","Type":"ContainerStarted","Data":"4820dee796892bbea91c62a4a316bb41ddf649df9bf3daeb6d23c50652f0bf08"} Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.473632 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" podStartSLOduration=2.77642571 podStartE2EDuration="5.473613439s" podCreationTimestamp="2025-09-30 17:55:56 +0000 UTC" firstStartedPulling="2025-09-30 17:55:57.569155801 +0000 UTC m=+808.091655039" lastFinishedPulling="2025-09-30 17:56:00.26634349 +0000 UTC m=+810.788842768" observedRunningTime="2025-09-30 
17:56:01.467314346 +0000 UTC m=+811.989813624" watchObservedRunningTime="2025-09-30 17:56:01.473613439 +0000 UTC m=+811.996112687" Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.494876 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-g9n7q" podStartSLOduration=2.763766925 podStartE2EDuration="5.494850958s" podCreationTimestamp="2025-09-30 17:55:56 +0000 UTC" firstStartedPulling="2025-09-30 17:55:57.511878459 +0000 UTC m=+808.034377697" lastFinishedPulling="2025-09-30 17:56:00.242962482 +0000 UTC m=+810.765461730" observedRunningTime="2025-09-30 17:56:01.492514514 +0000 UTC m=+812.015013802" watchObservedRunningTime="2025-09-30 17:56:01.494850958 +0000 UTC m=+812.017350236" Sep 30 17:56:01 crc kubenswrapper[4797]: I0930 17:56:01.516769 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-x2tc2" podStartSLOduration=2.193811954 podStartE2EDuration="5.516742055s" podCreationTimestamp="2025-09-30 17:55:56 +0000 UTC" firstStartedPulling="2025-09-30 17:55:56.925622483 +0000 UTC m=+807.448121721" lastFinishedPulling="2025-09-30 17:56:00.248552574 +0000 UTC m=+810.771051822" observedRunningTime="2025-09-30 17:56:01.512545351 +0000 UTC m=+812.035044589" watchObservedRunningTime="2025-09-30 17:56:01.516742055 +0000 UTC m=+812.039241333" Sep 30 17:56:03 crc kubenswrapper[4797]: I0930 17:56:03.459414 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6" event={"ID":"5e6982a1-49c0-428e-8b68-38899f1be907","Type":"ContainerStarted","Data":"adb5805ec491dab35bdc6be9fea7fec130dd0db42ccd28edf4f65a1e5b9d65fa"} Sep 30 17:56:03 crc kubenswrapper[4797]: I0930 17:56:03.480526 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-4nbj6" podStartSLOduration=2.008038585 podStartE2EDuration="7.480506554s" 
podCreationTimestamp="2025-09-30 17:55:56 +0000 UTC" firstStartedPulling="2025-09-30 17:55:57.38291712 +0000 UTC m=+807.905416358" lastFinishedPulling="2025-09-30 17:56:02.855385089 +0000 UTC m=+813.377884327" observedRunningTime="2025-09-30 17:56:03.477095051 +0000 UTC m=+813.999594289" watchObservedRunningTime="2025-09-30 17:56:03.480506554 +0000 UTC m=+814.003005792" Sep 30 17:56:06 crc kubenswrapper[4797]: I0930 17:56:06.872539 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-x2tc2" Sep 30 17:56:07 crc kubenswrapper[4797]: I0930 17:56:07.299546 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:56:07 crc kubenswrapper[4797]: I0930 17:56:07.299793 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:56:07 crc kubenswrapper[4797]: I0930 17:56:07.306030 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:56:07 crc kubenswrapper[4797]: I0930 17:56:07.488123 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb7c8495b-75wdc" Sep 30 17:56:07 crc kubenswrapper[4797]: I0930 17:56:07.558583 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:56:17 crc kubenswrapper[4797]: I0930 17:56:17.148025 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bxqvs" Sep 30 17:56:32 crc kubenswrapper[4797]: I0930 17:56:32.621860 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ngfnz" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" 
containerID="cri-o://a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7" gracePeriod=15 Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.008450 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ngfnz_755b943f-97b9-4ad6-b2cc-0f4c11d62fc0/console/0.log" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.008851 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085141 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdwjx\" (UniqueName: \"kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085283 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085352 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085391 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 
17:56:33.085499 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085559 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.085605 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle\") pod \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\" (UID: \"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0\") " Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.087372 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.087402 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.087608 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.088597 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config" (OuterVolumeSpecName: "console-config") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.109767 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.113212 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.120777 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx" (OuterVolumeSpecName: "kube-api-access-bdwjx") pod "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" (UID: "755b943f-97b9-4ad6-b2cc-0f4c11d62fc0"). InnerVolumeSpecName "kube-api-access-bdwjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189049 4797 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189088 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdwjx\" (UniqueName: \"kubernetes.io/projected/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-kube-api-access-bdwjx\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189103 4797 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189115 4797 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189127 4797 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189138 4797 reconciler_common.go:293] "Volume detached for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.189150 4797 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.458602 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78"] Sep 30 17:56:33 crc kubenswrapper[4797]: E0930 17:56:33.458827 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.458839 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.458944 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerName="console" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.459688 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.461903 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.473542 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78"] Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.594230 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.594286 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx7m4\" (UniqueName: \"kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.594313 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: 
I0930 17:56:33.696048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.696108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx7m4\" (UniqueName: \"kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.696134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.696788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.697160 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.707882 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ngfnz_755b943f-97b9-4ad6-b2cc-0f4c11d62fc0/console/0.log" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.707942 4797 generic.go:334] "Generic (PLEG): container finished" podID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" containerID="a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7" exitCode=2 Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.707979 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngfnz" event={"ID":"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0","Type":"ContainerDied","Data":"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7"} Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.708012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngfnz" event={"ID":"755b943f-97b9-4ad6-b2cc-0f4c11d62fc0","Type":"ContainerDied","Data":"50419cb32ecb5f89c82a214c669506aa90c9bad7d159a6231dc58f24b6b939a6"} Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.708031 4797 scope.go:117] "RemoveContainer" containerID="a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.708174 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ngfnz" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.721299 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx7m4\" (UniqueName: \"kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.744787 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.748859 4797 scope.go:117] "RemoveContainer" containerID="a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7" Sep 30 17:56:33 crc kubenswrapper[4797]: E0930 17:56:33.749755 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7\": container with ID starting with a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7 not found: ID does not exist" containerID="a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7" Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.749822 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7"} err="failed to get container status \"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7\": rpc error: code = NotFound desc = could not find container \"a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7\": container with ID starting with a42a428146eff4c5e6c5a097a0e322523961a4cdb47f01332a5dd43a5203f7b7 not found: ID does not exist" Sep 30 17:56:33 crc kubenswrapper[4797]: 
I0930 17:56:33.749902 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ngfnz"] Sep 30 17:56:33 crc kubenswrapper[4797]: I0930 17:56:33.775733 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:34 crc kubenswrapper[4797]: I0930 17:56:34.244636 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755b943f-97b9-4ad6-b2cc-0f4c11d62fc0" path="/var/lib/kubelet/pods/755b943f-97b9-4ad6-b2cc-0f4c11d62fc0/volumes" Sep 30 17:56:34 crc kubenswrapper[4797]: I0930 17:56:34.246528 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78"] Sep 30 17:56:34 crc kubenswrapper[4797]: W0930 17:56:34.252205 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d64586_8967_4e80_809d_e205470ca444.slice/crio-c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6 WatchSource:0}: Error finding container c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6: Status 404 returned error can't find the container with id c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6 Sep 30 17:56:34 crc kubenswrapper[4797]: I0930 17:56:34.716077 4797 generic.go:334] "Generic (PLEG): container finished" podID="44d64586-8967-4e80-809d-e205470ca444" containerID="600bd2f64682f32d80c94866dcb1a10fc35d633a1ac64129d41900b4b982996b" exitCode=0 Sep 30 17:56:34 crc kubenswrapper[4797]: I0930 17:56:34.716170 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" event={"ID":"44d64586-8967-4e80-809d-e205470ca444","Type":"ContainerDied","Data":"600bd2f64682f32d80c94866dcb1a10fc35d633a1ac64129d41900b4b982996b"} Sep 30 17:56:34 crc 
kubenswrapper[4797]: I0930 17:56:34.718652 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" event={"ID":"44d64586-8967-4e80-809d-e205470ca444","Type":"ContainerStarted","Data":"c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6"} Sep 30 17:56:36 crc kubenswrapper[4797]: I0930 17:56:36.734678 4797 generic.go:334] "Generic (PLEG): container finished" podID="44d64586-8967-4e80-809d-e205470ca444" containerID="31a7191fc0a28b65b8cb3837e5a3629f941bc7d94228fc11a52f68ad83b0ac9d" exitCode=0 Sep 30 17:56:36 crc kubenswrapper[4797]: I0930 17:56:36.734749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" event={"ID":"44d64586-8967-4e80-809d-e205470ca444","Type":"ContainerDied","Data":"31a7191fc0a28b65b8cb3837e5a3629f941bc7d94228fc11a52f68ad83b0ac9d"} Sep 30 17:56:37 crc kubenswrapper[4797]: I0930 17:56:37.746385 4797 generic.go:334] "Generic (PLEG): container finished" podID="44d64586-8967-4e80-809d-e205470ca444" containerID="083e72b5a3c5606906e29b682af932f9aa5c9ce5b5c27f82d9a0eac8bda06367" exitCode=0 Sep 30 17:56:37 crc kubenswrapper[4797]: I0930 17:56:37.746564 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" event={"ID":"44d64586-8967-4e80-809d-e205470ca444","Type":"ContainerDied","Data":"083e72b5a3c5606906e29b682af932f9aa5c9ce5b5c27f82d9a0eac8bda06367"} Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.102839 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.185844 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle\") pod \"44d64586-8967-4e80-809d-e205470ca444\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.185946 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util\") pod \"44d64586-8967-4e80-809d-e205470ca444\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.186069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx7m4\" (UniqueName: \"kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4\") pod \"44d64586-8967-4e80-809d-e205470ca444\" (UID: \"44d64586-8967-4e80-809d-e205470ca444\") " Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.188599 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle" (OuterVolumeSpecName: "bundle") pod "44d64586-8967-4e80-809d-e205470ca444" (UID: "44d64586-8967-4e80-809d-e205470ca444"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.195489 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4" (OuterVolumeSpecName: "kube-api-access-xx7m4") pod "44d64586-8967-4e80-809d-e205470ca444" (UID: "44d64586-8967-4e80-809d-e205470ca444"). InnerVolumeSpecName "kube-api-access-xx7m4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.219415 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util" (OuterVolumeSpecName: "util") pod "44d64586-8967-4e80-809d-e205470ca444" (UID: "44d64586-8967-4e80-809d-e205470ca444"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.288011 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx7m4\" (UniqueName: \"kubernetes.io/projected/44d64586-8967-4e80-809d-e205470ca444-kube-api-access-xx7m4\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.288065 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.288082 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44d64586-8967-4e80-809d-e205470ca444-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.762711 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" event={"ID":"44d64586-8967-4e80-809d-e205470ca444","Type":"ContainerDied","Data":"c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6"} Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.762761 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8530c458be05c4d0fa4cb12b981b15621dd1c9191d025e2e8f903a117ee49a6" Sep 30 17:56:39 crc kubenswrapper[4797]: I0930 17:56:39.763283 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78" Sep 30 17:56:41 crc kubenswrapper[4797]: E0930 17:56:41.727132 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.835267 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz"] Sep 30 17:56:48 crc kubenswrapper[4797]: E0930 17:56:48.836001 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="util" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.836015 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="util" Sep 30 17:56:48 crc kubenswrapper[4797]: E0930 17:56:48.836034 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="pull" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.836042 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="pull" Sep 30 17:56:48 crc kubenswrapper[4797]: E0930 17:56:48.836055 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="extract" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.836062 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="extract" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.836153 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d64586-8967-4e80-809d-e205470ca444" containerName="extract" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.836567 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.839250 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.839474 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.841806 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.842322 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.850972 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5fwsj" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.869291 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz"] Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.927481 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.927567 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-webhook-cert\") pod 
\"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:48 crc kubenswrapper[4797]: I0930 17:56:48.927599 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7cx\" (UniqueName: \"kubernetes.io/projected/10a1e68f-3d10-4da9-82db-d1043c94bcd8-kube-api-access-rd7cx\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.028116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-webhook-cert\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.028386 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7cx\" (UniqueName: \"kubernetes.io/projected/10a1e68f-3d10-4da9-82db-d1043c94bcd8-kube-api-access-rd7cx\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.028413 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc 
kubenswrapper[4797]: I0930 17:56:49.037065 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-webhook-cert\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.038235 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a1e68f-3d10-4da9-82db-d1043c94bcd8-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.056311 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7cx\" (UniqueName: \"kubernetes.io/projected/10a1e68f-3d10-4da9-82db-d1043c94bcd8-kube-api-access-rd7cx\") pod \"metallb-operator-controller-manager-6cb9fbcf6-fj2qz\" (UID: \"10a1e68f-3d10-4da9-82db-d1043c94bcd8\") " pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.200700 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.291687 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz"] Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.292422 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.294263 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-rbtmq" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.294815 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.309387 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.321060 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz"] Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.331741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr6sr\" (UniqueName: \"kubernetes.io/projected/77965127-1121-47ad-96b5-34229a106e24-kube-api-access-jr6sr\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.331882 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-apiservice-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.331955 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-webhook-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.433262 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-apiservice-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.433357 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-webhook-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.433391 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr6sr\" (UniqueName: \"kubernetes.io/projected/77965127-1121-47ad-96b5-34229a106e24-kube-api-access-jr6sr\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.437078 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-apiservice-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 
17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.439871 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77965127-1121-47ad-96b5-34229a106e24-webhook-cert\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.453463 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr6sr\" (UniqueName: \"kubernetes.io/projected/77965127-1121-47ad-96b5-34229a106e24-kube-api-access-jr6sr\") pod \"metallb-operator-webhook-server-6f4f79dfb8-6s6zz\" (UID: \"77965127-1121-47ad-96b5-34229a106e24\") " pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.609538 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.682377 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz"] Sep 30 17:56:49 crc kubenswrapper[4797]: W0930 17:56:49.692118 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a1e68f_3d10_4da9_82db_d1043c94bcd8.slice/crio-d37bd5a066e01f8561eb04721ed69b79cbb6cc447bff2e604adff653ece9f648 WatchSource:0}: Error finding container d37bd5a066e01f8561eb04721ed69b79cbb6cc447bff2e604adff653ece9f648: Status 404 returned error can't find the container with id d37bd5a066e01f8561eb04721ed69b79cbb6cc447bff2e604adff653ece9f648 Sep 30 17:56:49 crc kubenswrapper[4797]: I0930 17:56:49.828248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" 
event={"ID":"10a1e68f-3d10-4da9-82db-d1043c94bcd8","Type":"ContainerStarted","Data":"d37bd5a066e01f8561eb04721ed69b79cbb6cc447bff2e604adff653ece9f648"} Sep 30 17:56:50 crc kubenswrapper[4797]: I0930 17:56:50.110595 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz"] Sep 30 17:56:50 crc kubenswrapper[4797]: I0930 17:56:50.833682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" event={"ID":"77965127-1121-47ad-96b5-34229a106e24","Type":"ContainerStarted","Data":"621b4e511c26bf9ec515644bf6c50ec799608f56b9aeeda1626210963bdcb62a"} Sep 30 17:56:53 crc kubenswrapper[4797]: I0930 17:56:53.852141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" event={"ID":"10a1e68f-3d10-4da9-82db-d1043c94bcd8","Type":"ContainerStarted","Data":"4cd7a87478cc07f090cf9dec50fd31ebc889ed896775becc5f7ad2a6db87f78d"} Sep 30 17:56:53 crc kubenswrapper[4797]: I0930 17:56:53.853635 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:56:53 crc kubenswrapper[4797]: I0930 17:56:53.888561 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" podStartSLOduration=2.124965543 podStartE2EDuration="5.888532287s" podCreationTimestamp="2025-09-30 17:56:48 +0000 UTC" firstStartedPulling="2025-09-30 17:56:49.695396283 +0000 UTC m=+860.217895531" lastFinishedPulling="2025-09-30 17:56:53.458963037 +0000 UTC m=+863.981462275" observedRunningTime="2025-09-30 17:56:53.878650888 +0000 UTC m=+864.401150146" watchObservedRunningTime="2025-09-30 17:56:53.888532287 +0000 UTC m=+864.411031575" Sep 30 17:56:55 crc kubenswrapper[4797]: I0930 17:56:55.867397 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" event={"ID":"77965127-1121-47ad-96b5-34229a106e24","Type":"ContainerStarted","Data":"330e637d621976e1b9de371162d3472b12c86b049ffa97fd6db92407374859c7"} Sep 30 17:56:55 crc kubenswrapper[4797]: I0930 17:56:55.867819 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.857663 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" podStartSLOduration=11.707121584 podStartE2EDuration="16.857647159s" podCreationTimestamp="2025-09-30 17:56:49 +0000 UTC" firstStartedPulling="2025-09-30 17:56:50.1307102 +0000 UTC m=+860.653209438" lastFinishedPulling="2025-09-30 17:56:55.281235735 +0000 UTC m=+865.803735013" observedRunningTime="2025-09-30 17:56:55.95323082 +0000 UTC m=+866.475730078" watchObservedRunningTime="2025-09-30 17:57:05.857647159 +0000 UTC m=+876.380146387" Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.858361 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.859708 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.875461 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.993015 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.993074 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:05 crc kubenswrapper[4797]: I0930 17:57:05.993096 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.094554 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.094810 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.094838 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.095081 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.095406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.116331 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62\") pod \"certified-operators-qxc9q\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.180225 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.758072 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.976603 4797 generic.go:334] "Generic (PLEG): container finished" podID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerID="b8025f874aa383c89557a7ce8dd9fe52eb892408c866d2615f81d336b2836fdf" exitCode=0 Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.976641 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerDied","Data":"b8025f874aa383c89557a7ce8dd9fe52eb892408c866d2615f81d336b2836fdf"} Sep 30 17:57:06 crc kubenswrapper[4797]: I0930 17:57:06.976665 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerStarted","Data":"8073db652baed536f84fadb9cdba5f81c408c31f5491f6081b9744607f10efc7"} Sep 30 17:57:07 crc kubenswrapper[4797]: I0930 17:57:07.985240 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerStarted","Data":"6f987c4a320145d3b7657082f918953a09f41ea08391ec82b5f16d57584cab78"} Sep 30 17:57:08 crc kubenswrapper[4797]: I0930 17:57:08.994086 4797 generic.go:334] "Generic (PLEG): container finished" podID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerID="6f987c4a320145d3b7657082f918953a09f41ea08391ec82b5f16d57584cab78" exitCode=0 Sep 30 17:57:08 crc kubenswrapper[4797]: I0930 17:57:08.994186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" 
event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerDied","Data":"6f987c4a320145d3b7657082f918953a09f41ea08391ec82b5f16d57584cab78"} Sep 30 17:57:09 crc kubenswrapper[4797]: I0930 17:57:09.614034 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6f4f79dfb8-6s6zz" Sep 30 17:57:11 crc kubenswrapper[4797]: I0930 17:57:11.006009 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerStarted","Data":"5d4d9d3bcc32d9709b7a3fcc58ea4565e66fa32a0745a9bd831cb566cbbeeb00"} Sep 30 17:57:11 crc kubenswrapper[4797]: I0930 17:57:11.023618 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxc9q" podStartSLOduration=3.056295586 podStartE2EDuration="6.023602415s" podCreationTimestamp="2025-09-30 17:57:05 +0000 UTC" firstStartedPulling="2025-09-30 17:57:06.978491079 +0000 UTC m=+877.500990317" lastFinishedPulling="2025-09-30 17:57:09.945797908 +0000 UTC m=+880.468297146" observedRunningTime="2025-09-30 17:57:11.020074328 +0000 UTC m=+881.542573566" watchObservedRunningTime="2025-09-30 17:57:11.023602415 +0000 UTC m=+881.546101653" Sep 30 17:57:16 crc kubenswrapper[4797]: I0930 17:57:16.180921 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:16 crc kubenswrapper[4797]: I0930 17:57:16.181658 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:16 crc kubenswrapper[4797]: I0930 17:57:16.220121 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:17 crc kubenswrapper[4797]: I0930 17:57:17.107488 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:17 crc kubenswrapper[4797]: I0930 17:57:17.156426 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:19 crc kubenswrapper[4797]: I0930 17:57:19.061825 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qxc9q" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="registry-server" containerID="cri-o://5d4d9d3bcc32d9709b7a3fcc58ea4565e66fa32a0745a9bd831cb566cbbeeb00" gracePeriod=2 Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.084707 4797 generic.go:334] "Generic (PLEG): container finished" podID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerID="5d4d9d3bcc32d9709b7a3fcc58ea4565e66fa32a0745a9bd831cb566cbbeeb00" exitCode=0 Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.084809 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerDied","Data":"5d4d9d3bcc32d9709b7a3fcc58ea4565e66fa32a0745a9bd831cb566cbbeeb00"} Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.624618 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.789082 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities\") pod \"f9004bae-1fdc-4e97-8c61-f6e070867092\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.789139 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62\") pod \"f9004bae-1fdc-4e97-8c61-f6e070867092\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.789255 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content\") pod \"f9004bae-1fdc-4e97-8c61-f6e070867092\" (UID: \"f9004bae-1fdc-4e97-8c61-f6e070867092\") " Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.791149 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities" (OuterVolumeSpecName: "utilities") pod "f9004bae-1fdc-4e97-8c61-f6e070867092" (UID: "f9004bae-1fdc-4e97-8c61-f6e070867092"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.810678 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62" (OuterVolumeSpecName: "kube-api-access-8fd62") pod "f9004bae-1fdc-4e97-8c61-f6e070867092" (UID: "f9004bae-1fdc-4e97-8c61-f6e070867092"). InnerVolumeSpecName "kube-api-access-8fd62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.834717 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9004bae-1fdc-4e97-8c61-f6e070867092" (UID: "f9004bae-1fdc-4e97-8c61-f6e070867092"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.891076 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.891389 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9004bae-1fdc-4e97-8c61-f6e070867092-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:20 crc kubenswrapper[4797]: I0930 17:57:20.891596 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fd62\" (UniqueName: \"kubernetes.io/projected/f9004bae-1fdc-4e97-8c61-f6e070867092-kube-api-access-8fd62\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.093449 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxc9q" event={"ID":"f9004bae-1fdc-4e97-8c61-f6e070867092","Type":"ContainerDied","Data":"8073db652baed536f84fadb9cdba5f81c408c31f5491f6081b9744607f10efc7"} Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.094579 4797 scope.go:117] "RemoveContainer" containerID="5d4d9d3bcc32d9709b7a3fcc58ea4565e66fa32a0745a9bd831cb566cbbeeb00" Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.093859 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxc9q" Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.117529 4797 scope.go:117] "RemoveContainer" containerID="6f987c4a320145d3b7657082f918953a09f41ea08391ec82b5f16d57584cab78" Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.128044 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.129185 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qxc9q"] Sep 30 17:57:21 crc kubenswrapper[4797]: I0930 17:57:21.146275 4797 scope.go:117] "RemoveContainer" containerID="b8025f874aa383c89557a7ce8dd9fe52eb892408c866d2615f81d336b2836fdf" Sep 30 17:57:22 crc kubenswrapper[4797]: I0930 17:57:22.250284 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" path="/var/lib/kubelet/pods/f9004bae-1fdc-4e97-8c61-f6e070867092/volumes" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.203918 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cb9fbcf6-fj2qz" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.998081 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-667c9"] Sep 30 17:57:29 crc kubenswrapper[4797]: E0930 17:57:29.998345 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="registry-server" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.998360 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="registry-server" Sep 30 17:57:29 crc kubenswrapper[4797]: E0930 17:57:29.998386 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" 
containerName="extract-utilities" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.998392 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="extract-utilities" Sep 30 17:57:29 crc kubenswrapper[4797]: E0930 17:57:29.998402 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="extract-content" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.998408 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="extract-content" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.998554 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9004bae-1fdc-4e97-8c61-f6e070867092" containerName="registry-server" Sep 30 17:57:29 crc kubenswrapper[4797]: I0930 17:57:29.999041 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.001827 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9q77v" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.007700 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hwndx"] Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.009206 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.010599 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.010672 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-667c9"] Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.013935 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.014574 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.076739 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dsmgz"] Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.077917 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.079979 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.080147 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.080281 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.080955 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-287dp" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.092620 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-kr68l"] Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.093767 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.100223 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.106210 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-kr68l"] Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.111825 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f491bbe4-c848-4384-a932-13d5242e5871-cert\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112108 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjfkg\" (UniqueName: \"kubernetes.io/projected/0e18ed12-faf4-42df-a3f1-97f3c090fa57-kube-api-access-pjfkg\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112242 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-reloader\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112366 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc 
kubenswrapper[4797]: I0930 17:57:30.112560 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-sockets\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdb9\" (UniqueName: \"kubernetes.io/projected/f491bbe4-c848-4384-a932-13d5242e5871-kube-api-access-wkdb9\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112781 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-conf\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112826 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.112872 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-startup\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213461 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213509 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213531 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-startup\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f491bbe4-c848-4384-a932-13d5242e5871-cert\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjfkg\" (UniqueName: \"kubernetes.io/projected/0e18ed12-faf4-42df-a3f1-97f3c090fa57-kube-api-access-pjfkg\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-reloader\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213632 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-sockets\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k98lz\" (UniqueName: \"kubernetes.io/projected/9e16cc99-60f9-4551-a18e-17f9beeca400-kube-api-access-k98lz\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213691 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213707 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metrics-certs\") pod 
\"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213723 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ld9\" (UniqueName: \"kubernetes.io/projected/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-kube-api-access-t2ld9\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213747 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdb9\" (UniqueName: \"kubernetes.io/projected/f491bbe4-c848-4384-a932-13d5242e5871-kube-api-access-wkdb9\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metallb-excludel2\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213775 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-conf\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.213792 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-cert\") pod \"controller-5d688f5ffc-kr68l\" (UID: 
\"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.214305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-conf\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.214353 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.214403 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-sockets\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.214578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0e18ed12-faf4-42df-a3f1-97f3c090fa57-reloader\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.216179 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.216389 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.218661 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 17:57:30 
crc kubenswrapper[4797]: E0930 17:57:30.223987 4797 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.224087 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs podName:0e18ed12-faf4-42df-a3f1-97f3c090fa57 nodeName:}" failed. No retries permitted until 2025-09-30 17:57:30.724063534 +0000 UTC m=+901.246562772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs") pod "frr-k8s-hwndx" (UID: "0e18ed12-faf4-42df-a3f1-97f3c090fa57") : secret "frr-k8s-certs-secret" not found Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.227307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0e18ed12-faf4-42df-a3f1-97f3c090fa57-frr-startup\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.229891 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f491bbe4-c848-4384-a932-13d5242e5871-cert\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.235275 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjfkg\" (UniqueName: \"kubernetes.io/projected/0e18ed12-faf4-42df-a3f1-97f3c090fa57-kube-api-access-pjfkg\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.248418 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wkdb9\" (UniqueName: \"kubernetes.io/projected/f491bbe4-c848-4384-a932-13d5242e5871-kube-api-access-wkdb9\") pod \"frr-k8s-webhook-server-5478bdb765-667c9\" (UID: \"f491bbe4-c848-4384-a932-13d5242e5871\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k98lz\" (UniqueName: \"kubernetes.io/projected/9e16cc99-60f9-4551-a18e-17f9beeca400-kube-api-access-k98lz\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315084 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315101 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metrics-certs\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315121 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ld9\" (UniqueName: \"kubernetes.io/projected/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-kube-api-access-t2ld9\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315146 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metallb-excludel2\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315164 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-cert\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.315204 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.317359 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.317567 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.317710 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.317891 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.318023 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.323807 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9q77v" Sep 30 17:57:30 crc 
kubenswrapper[4797]: E0930 17:57:30.325541 4797 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.325618 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs podName:9e16cc99-60f9-4551-a18e-17f9beeca400 nodeName:}" failed. No retries permitted until 2025-09-30 17:57:30.825598444 +0000 UTC m=+901.348097672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs") pod "controller-5d688f5ffc-kr68l" (UID: "9e16cc99-60f9-4551-a18e-17f9beeca400") : secret "controller-certs-secret" not found Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.325546 4797 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.325655 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist podName:0acb5984-08fc-4f2f-95d1-e65ba209a2f6 nodeName:}" failed. No retries permitted until 2025-09-30 17:57:30.825648765 +0000 UTC m=+901.348148003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist") pod "speaker-dsmgz" (UID: "0acb5984-08fc-4f2f-95d1-e65ba209a2f6") : secret "metallb-memberlist" not found Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.326305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metallb-excludel2\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.328403 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-cert\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.328796 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-metrics-certs\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.333267 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.336122 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k98lz\" (UniqueName: \"kubernetes.io/projected/9e16cc99-60f9-4551-a18e-17f9beeca400-kube-api-access-k98lz\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.338394 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ld9\" (UniqueName: \"kubernetes.io/projected/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-kube-api-access-t2ld9\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.756649 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-667c9"] Sep 30 17:57:30 crc kubenswrapper[4797]: W0930 17:57:30.764596 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf491bbe4_c848_4384_a932_13d5242e5871.slice/crio-9906137b08eff9a725872f0798dab25f4d9e6e5e4d35be4f2c6bb45abbe90d58 WatchSource:0}: Error finding container 9906137b08eff9a725872f0798dab25f4d9e6e5e4d35be4f2c6bb45abbe90d58: Status 404 returned error can't find the container with id 9906137b08eff9a725872f0798dab25f4d9e6e5e4d35be4f2c6bb45abbe90d58 Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.821665 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.828617 
4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e18ed12-faf4-42df-a3f1-97f3c090fa57-metrics-certs\") pod \"frr-k8s-hwndx\" (UID: \"0e18ed12-faf4-42df-a3f1-97f3c090fa57\") " pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.923668 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.923847 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.923973 4797 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 17:57:30 crc kubenswrapper[4797]: E0930 17:57:30.924039 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist podName:0acb5984-08fc-4f2f-95d1-e65ba209a2f6 nodeName:}" failed. No retries permitted until 2025-09-30 17:57:31.924023292 +0000 UTC m=+902.446522530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist") pod "speaker-dsmgz" (UID: "0acb5984-08fc-4f2f-95d1-e65ba209a2f6") : secret "metallb-memberlist" not found Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.928879 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e16cc99-60f9-4551-a18e-17f9beeca400-metrics-certs\") pod \"controller-5d688f5ffc-kr68l\" (UID: \"9e16cc99-60f9-4551-a18e-17f9beeca400\") " pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:30 crc kubenswrapper[4797]: I0930 17:57:30.931681 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.009495 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.162248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"ef65f123765e64e75ac1ab82877a20d95472cc75e5e0c4265428f7f87fbfc9b6"} Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.163273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" event={"ID":"f491bbe4-c848-4384-a932-13d5242e5871","Type":"ContainerStarted","Data":"9906137b08eff9a725872f0798dab25f4d9e6e5e4d35be4f2c6bb45abbe90d58"} Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.451661 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-kr68l"] Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.942884 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:31 crc kubenswrapper[4797]: I0930 17:57:31.955486 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0acb5984-08fc-4f2f-95d1-e65ba209a2f6-memberlist\") pod \"speaker-dsmgz\" (UID: \"0acb5984-08fc-4f2f-95d1-e65ba209a2f6\") " pod="metallb-system/speaker-dsmgz" Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.171627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kr68l" event={"ID":"9e16cc99-60f9-4551-a18e-17f9beeca400","Type":"ContainerStarted","Data":"5c160d3fa69bd33beb12bb401ad1f2bbe07505d357d0fade72b454e434dd20f8"} Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.171709 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kr68l" event={"ID":"9e16cc99-60f9-4551-a18e-17f9beeca400","Type":"ContainerStarted","Data":"5ac3c5c131082ce11e0e2160b18101ff1d479f0a3dada58f218b0a61b513d7aa"} Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.171738 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kr68l" event={"ID":"9e16cc99-60f9-4551-a18e-17f9beeca400","Type":"ContainerStarted","Data":"3839c0b4b25c1668bb2a8369db79ec97d779b9b60c3d34df7c7d7e507f793433"} Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.171804 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.194761 4797 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-287dp" Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.196782 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/controller-5d688f5ffc-kr68l" podStartSLOduration=2.196760296 podStartE2EDuration="2.196760296s" podCreationTimestamp="2025-09-30 17:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:57:32.193539729 +0000 UTC m=+902.716038977" watchObservedRunningTime="2025-09-30 17:57:32.196760296 +0000 UTC m=+902.719259534" Sep 30 17:57:32 crc kubenswrapper[4797]: I0930 17:57:32.204333 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dsmgz" Sep 30 17:57:32 crc kubenswrapper[4797]: W0930 17:57:32.261111 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0acb5984_08fc_4f2f_95d1_e65ba209a2f6.slice/crio-3afc742a3988e492b8c77a7d0ebaf71991946a85673ced774360e929505f6d60 WatchSource:0}: Error finding container 3afc742a3988e492b8c77a7d0ebaf71991946a85673ced774360e929505f6d60: Status 404 returned error can't find the container with id 3afc742a3988e492b8c77a7d0ebaf71991946a85673ced774360e929505f6d60 Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.126330 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.127838 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.140866 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.191218 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dsmgz" event={"ID":"0acb5984-08fc-4f2f-95d1-e65ba209a2f6","Type":"ContainerStarted","Data":"72988073d00c163d3eff5008fba8434eb8521a82ee2be27aa1e48ef0ef70be10"} Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.191274 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dsmgz" event={"ID":"0acb5984-08fc-4f2f-95d1-e65ba209a2f6","Type":"ContainerStarted","Data":"353186822c76f0d8837ce51d3566ebe14c524f1e0a15b05afa46b391cf2ccff0"} Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.191285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dsmgz" event={"ID":"0acb5984-08fc-4f2f-95d1-e65ba209a2f6","Type":"ContainerStarted","Data":"3afc742a3988e492b8c77a7d0ebaf71991946a85673ced774360e929505f6d60"} Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.191456 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dsmgz" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.217394 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dsmgz" podStartSLOduration=3.217379842 podStartE2EDuration="3.217379842s" podCreationTimestamp="2025-09-30 17:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:57:33.21580485 +0000 UTC m=+903.738304098" watchObservedRunningTime="2025-09-30 17:57:33.217379842 +0000 UTC m=+903.739879080" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.265100 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.265147 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59642\" (UniqueName: \"kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.265176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.366577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.366644 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59642\" (UniqueName: \"kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 
17:57:33.366672 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.367581 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.368229 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.388596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59642\" (UniqueName: \"kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642\") pod \"redhat-marketplace-g75xp\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:33 crc kubenswrapper[4797]: I0930 17:57:33.488359 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:34 crc kubenswrapper[4797]: I0930 17:57:34.076816 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:34 crc kubenswrapper[4797]: W0930 17:57:34.090807 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d7de13_a54b_4adc_ae50_e5e29a44063c.slice/crio-0925ad4dbd6dbc39458e953f1daeff79050b01818e62b9f23cdd114d2f6f5da6 WatchSource:0}: Error finding container 0925ad4dbd6dbc39458e953f1daeff79050b01818e62b9f23cdd114d2f6f5da6: Status 404 returned error can't find the container with id 0925ad4dbd6dbc39458e953f1daeff79050b01818e62b9f23cdd114d2f6f5da6 Sep 30 17:57:34 crc kubenswrapper[4797]: I0930 17:57:34.207255 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerStarted","Data":"0925ad4dbd6dbc39458e953f1daeff79050b01818e62b9f23cdd114d2f6f5da6"} Sep 30 17:57:35 crc kubenswrapper[4797]: I0930 17:57:35.227047 4797 generic.go:334] "Generic (PLEG): container finished" podID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerID="5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa" exitCode=0 Sep 30 17:57:35 crc kubenswrapper[4797]: I0930 17:57:35.227077 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerDied","Data":"5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa"} Sep 30 17:57:36 crc kubenswrapper[4797]: I0930 17:57:36.235407 4797 generic.go:334] "Generic (PLEG): container finished" podID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerID="8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7" exitCode=0 Sep 30 17:57:36 crc kubenswrapper[4797]: I0930 
17:57:36.235688 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerDied","Data":"8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7"} Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.284330 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerStarted","Data":"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1"} Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.285878 4797 generic.go:334] "Generic (PLEG): container finished" podID="0e18ed12-faf4-42df-a3f1-97f3c090fa57" containerID="2cb8b2266f886296912fb3609c153b2de7e58ea5975f3d7fed29d9a503b20777" exitCode=0 Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.285941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerDied","Data":"2cb8b2266f886296912fb3609c153b2de7e58ea5975f3d7fed29d9a503b20777"} Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.287495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" event={"ID":"f491bbe4-c848-4384-a932-13d5242e5871","Type":"ContainerStarted","Data":"e914401873f55a3779dba6f7a5b541496e479f9d6c3a83b6033793355cec67aa"} Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.287640 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.305518 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g75xp" podStartSLOduration=2.465757169 podStartE2EDuration="6.305496689s" podCreationTimestamp="2025-09-30 17:57:33 +0000 UTC" 
firstStartedPulling="2025-09-30 17:57:35.230873638 +0000 UTC m=+905.753372866" lastFinishedPulling="2025-09-30 17:57:39.070613118 +0000 UTC m=+909.593112386" observedRunningTime="2025-09-30 17:57:39.300763751 +0000 UTC m=+909.823263009" watchObservedRunningTime="2025-09-30 17:57:39.305496689 +0000 UTC m=+909.827995927" Sep 30 17:57:39 crc kubenswrapper[4797]: I0930 17:57:39.324029 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" podStartSLOduration=2.354691987 podStartE2EDuration="10.323994074s" podCreationTimestamp="2025-09-30 17:57:29 +0000 UTC" firstStartedPulling="2025-09-30 17:57:30.767596724 +0000 UTC m=+901.290095982" lastFinishedPulling="2025-09-30 17:57:38.736898821 +0000 UTC m=+909.259398069" observedRunningTime="2025-09-30 17:57:39.319837441 +0000 UTC m=+909.842336689" watchObservedRunningTime="2025-09-30 17:57:39.323994074 +0000 UTC m=+909.846493312" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.299883 4797 generic.go:334] "Generic (PLEG): container finished" podID="0e18ed12-faf4-42df-a3f1-97f3c090fa57" containerID="902f5d78a3e7e9ec53993546c5502946e2d30946edb8aa98a2420a3241583272" exitCode=0 Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.300001 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerDied","Data":"902f5d78a3e7e9ec53993546c5502946e2d30946edb8aa98a2420a3241583272"} Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.514602 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.517042 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.532930 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.678057 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qc5j\" (UniqueName: \"kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.678121 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.678165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.779369 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qc5j\" (UniqueName: \"kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.779484 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.779548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.780057 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.780658 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.818058 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qc5j\" (UniqueName: \"kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j\") pod \"community-operators-tz2zw\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:40 crc kubenswrapper[4797]: I0930 17:57:40.833694 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:41 crc kubenswrapper[4797]: I0930 17:57:41.025698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-kr68l" Sep 30 17:57:41 crc kubenswrapper[4797]: I0930 17:57:41.308657 4797 generic.go:334] "Generic (PLEG): container finished" podID="0e18ed12-faf4-42df-a3f1-97f3c090fa57" containerID="11f650110be4c0579bca1be42ee585e8d4ebe92353a6a5638e8321fb8be6a56f" exitCode=0 Sep 30 17:57:41 crc kubenswrapper[4797]: I0930 17:57:41.308699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerDied","Data":"11f650110be4c0579bca1be42ee585e8d4ebe92353a6a5638e8321fb8be6a56f"} Sep 30 17:57:41 crc kubenswrapper[4797]: I0930 17:57:41.370844 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.209382 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dsmgz" Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.316763 4797 generic.go:334] "Generic (PLEG): container finished" podID="814293a2-b8df-483c-9356-931fdf6f1298" containerID="1bd31256128b2530c0af858e8e3d8c657d5bf35772ea7f1dd3288dd2192a8145" exitCode=0 Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.316959 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerDied","Data":"1bd31256128b2530c0af858e8e3d8c657d5bf35772ea7f1dd3288dd2192a8145"} Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.317046 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" 
event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerStarted","Data":"d60af088b5ffd448becb501f7fb793178f79224dc3e5fb46aaf5b052c29cf71b"} Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.327284 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"4499ad60000b4ed96d60c053b8a5436a919c5984590664f5b8bc3ba395506593"} Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.327377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"71d35c53a03650d47d5270e12a333534f5bdd96d94ba72937670f11b86b3eefe"} Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.327400 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"5ee7d6276b7800358feae000f264e2a70d172d0fe989064c04799bb8e46084e0"} Sep 30 17:57:42 crc kubenswrapper[4797]: I0930 17:57:42.327415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"78d0987e8078e9e8f73a69f0070b07d43b056a7256a5fc92a07564f64fc13ccf"} Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.339835 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerStarted","Data":"8aafcbede101daa4a77e017d4a5cb03e3b70d50755bd81217c087aa7519f20ae"} Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.345323 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" 
event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"8f622aef0c862a6975686a16aacab4d98d7db5c1041d876f2ee30504c1a16586"} Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.345377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hwndx" event={"ID":"0e18ed12-faf4-42df-a3f1-97f3c090fa57","Type":"ContainerStarted","Data":"86690d576b399f80c3a9ff270028b57819b2f6e4cf831087a18fbaa42a65fabd"} Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.345695 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.403349 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hwndx" podStartSLOduration=6.726022672 podStartE2EDuration="14.403330313s" podCreationTimestamp="2025-09-30 17:57:29 +0000 UTC" firstStartedPulling="2025-09-30 17:57:31.092042465 +0000 UTC m=+901.614541713" lastFinishedPulling="2025-09-30 17:57:38.769350116 +0000 UTC m=+909.291849354" observedRunningTime="2025-09-30 17:57:43.393793403 +0000 UTC m=+913.916292641" watchObservedRunningTime="2025-09-30 17:57:43.403330313 +0000 UTC m=+913.925829561" Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.488783 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.489946 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:43 crc kubenswrapper[4797]: I0930 17:57:43.549729 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:44 crc kubenswrapper[4797]: I0930 17:57:44.192875 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:57:44 crc kubenswrapper[4797]: I0930 17:57:44.193290 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:57:44 crc kubenswrapper[4797]: I0930 17:57:44.356503 4797 generic.go:334] "Generic (PLEG): container finished" podID="814293a2-b8df-483c-9356-931fdf6f1298" containerID="8aafcbede101daa4a77e017d4a5cb03e3b70d50755bd81217c087aa7519f20ae" exitCode=0 Sep 30 17:57:44 crc kubenswrapper[4797]: I0930 17:57:44.357751 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerDied","Data":"8aafcbede101daa4a77e017d4a5cb03e3b70d50755bd81217c087aa7519f20ae"} Sep 30 17:57:44 crc kubenswrapper[4797]: I0930 17:57:44.415168 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:45 crc kubenswrapper[4797]: I0930 17:57:45.932575 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:45 crc kubenswrapper[4797]: I0930 17:57:45.982995 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:57:46 crc kubenswrapper[4797]: I0930 17:57:46.375408 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" 
event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerStarted","Data":"cec738ef6fee476934e564b82afaaeec9847a21af6f6de518bf6b72b7fb1841a"} Sep 30 17:57:46 crc kubenswrapper[4797]: I0930 17:57:46.407202 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tz2zw" podStartSLOduration=3.059928022 podStartE2EDuration="6.407179187s" podCreationTimestamp="2025-09-30 17:57:40 +0000 UTC" firstStartedPulling="2025-09-30 17:57:42.318393351 +0000 UTC m=+912.840892599" lastFinishedPulling="2025-09-30 17:57:45.665644526 +0000 UTC m=+916.188143764" observedRunningTime="2025-09-30 17:57:46.402946993 +0000 UTC m=+916.925446231" watchObservedRunningTime="2025-09-30 17:57:46.407179187 +0000 UTC m=+916.929678465" Sep 30 17:57:46 crc kubenswrapper[4797]: I0930 17:57:46.504172 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.382815 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g75xp" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="registry-server" containerID="cri-o://3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1" gracePeriod=2 Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.788228 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.875349 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities\") pod \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.875640 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59642\" (UniqueName: \"kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642\") pod \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.875785 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content\") pod \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\" (UID: \"b7d7de13-a54b-4adc-ae50-e5e29a44063c\") " Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.876803 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities" (OuterVolumeSpecName: "utilities") pod "b7d7de13-a54b-4adc-ae50-e5e29a44063c" (UID: "b7d7de13-a54b-4adc-ae50-e5e29a44063c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.880956 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642" (OuterVolumeSpecName: "kube-api-access-59642") pod "b7d7de13-a54b-4adc-ae50-e5e29a44063c" (UID: "b7d7de13-a54b-4adc-ae50-e5e29a44063c"). InnerVolumeSpecName "kube-api-access-59642". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.894109 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7d7de13-a54b-4adc-ae50-e5e29a44063c" (UID: "b7d7de13-a54b-4adc-ae50-e5e29a44063c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.977162 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.977221 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59642\" (UniqueName: \"kubernetes.io/projected/b7d7de13-a54b-4adc-ae50-e5e29a44063c-kube-api-access-59642\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:47 crc kubenswrapper[4797]: I0930 17:57:47.977235 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d7de13-a54b-4adc-ae50-e5e29a44063c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.393038 4797 generic.go:334] "Generic (PLEG): container finished" podID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerID="3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1" exitCode=0 Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.393101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerDied","Data":"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1"} Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.393133 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g75xp" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.393151 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g75xp" event={"ID":"b7d7de13-a54b-4adc-ae50-e5e29a44063c","Type":"ContainerDied","Data":"0925ad4dbd6dbc39458e953f1daeff79050b01818e62b9f23cdd114d2f6f5da6"} Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.393184 4797 scope.go:117] "RemoveContainer" containerID="3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.416261 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.420780 4797 scope.go:117] "RemoveContainer" containerID="8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.423020 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g75xp"] Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.435748 4797 scope.go:117] "RemoveContainer" containerID="5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.465421 4797 scope.go:117] "RemoveContainer" containerID="3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1" Sep 30 17:57:48 crc kubenswrapper[4797]: E0930 17:57:48.466934 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1\": container with ID starting with 3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1 not found: ID does not exist" containerID="3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.467002 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1"} err="failed to get container status \"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1\": rpc error: code = NotFound desc = could not find container \"3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1\": container with ID starting with 3aeafacc7482dee0f88b3cbab5aab910caa7e94cd78e879b1796ffde4ecda1c1 not found: ID does not exist" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.467042 4797 scope.go:117] "RemoveContainer" containerID="8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7" Sep 30 17:57:48 crc kubenswrapper[4797]: E0930 17:57:48.467584 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7\": container with ID starting with 8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7 not found: ID does not exist" containerID="8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.467619 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7"} err="failed to get container status \"8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7\": rpc error: code = NotFound desc = could not find container \"8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7\": container with ID starting with 8a8fc3251108248c70f7aacdb7f707e377bacec6583a24b62e30f4d25be980b7 not found: ID does not exist" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.467643 4797 scope.go:117] "RemoveContainer" containerID="5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa" Sep 30 17:57:48 crc kubenswrapper[4797]: E0930 
17:57:48.467921 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa\": container with ID starting with 5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa not found: ID does not exist" containerID="5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa" Sep 30 17:57:48 crc kubenswrapper[4797]: I0930 17:57:48.467952 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa"} err="failed to get container status \"5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa\": rpc error: code = NotFound desc = could not find container \"5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa\": container with ID starting with 5aa756f0ea5d210155aa97ed8614aca5bd96ac0631d87cc8abf3ca8ef937c2aa not found: ID does not exist" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.309727 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qkts8"] Sep 30 17:57:49 crc kubenswrapper[4797]: E0930 17:57:49.310135 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="registry-server" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.310166 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="registry-server" Sep 30 17:57:49 crc kubenswrapper[4797]: E0930 17:57:49.310200 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="extract-utilities" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.310214 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="extract-utilities" Sep 30 17:57:49 
crc kubenswrapper[4797]: E0930 17:57:49.310235 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="extract-content" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.310246 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="extract-content" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.310519 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" containerName="registry-server" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.311256 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.314098 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jwrqm" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.315020 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.315182 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.315612 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qkts8"] Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.396377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjfs\" (UniqueName: \"kubernetes.io/projected/43506dea-1282-457a-9201-a2c9f9baa6f3-kube-api-access-ctjfs\") pod \"openstack-operator-index-qkts8\" (UID: \"43506dea-1282-457a-9201-a2c9f9baa6f3\") " pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 
17:57:49.497617 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjfs\" (UniqueName: \"kubernetes.io/projected/43506dea-1282-457a-9201-a2c9f9baa6f3-kube-api-access-ctjfs\") pod \"openstack-operator-index-qkts8\" (UID: \"43506dea-1282-457a-9201-a2c9f9baa6f3\") " pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.525245 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjfs\" (UniqueName: \"kubernetes.io/projected/43506dea-1282-457a-9201-a2c9f9baa6f3-kube-api-access-ctjfs\") pod \"openstack-operator-index-qkts8\" (UID: \"43506dea-1282-457a-9201-a2c9f9baa6f3\") " pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:49 crc kubenswrapper[4797]: I0930 17:57:49.646140 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.110283 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qkts8"] Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.247098 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d7de13-a54b-4adc-ae50-e5e29a44063c" path="/var/lib/kubelet/pods/b7d7de13-a54b-4adc-ae50-e5e29a44063c/volumes" Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.338701 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-667c9" Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.408084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qkts8" event={"ID":"43506dea-1282-457a-9201-a2c9f9baa6f3","Type":"ContainerStarted","Data":"ef862a3d54c7c161137cb4e9e8f7b3d95cea062646937cf8a01d9565dbb4e1bc"} Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.834713 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.835166 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:50 crc kubenswrapper[4797]: I0930 17:57:50.892958 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:51 crc kubenswrapper[4797]: I0930 17:57:51.457819 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:53 crc kubenswrapper[4797]: I0930 17:57:53.431839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qkts8" event={"ID":"43506dea-1282-457a-9201-a2c9f9baa6f3","Type":"ContainerStarted","Data":"a3a6a3663f2b094864ba01df1a4a0506d94b5699265c91555376d5529777bbca"} Sep 30 17:57:53 crc kubenswrapper[4797]: I0930 17:57:53.458586 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qkts8" podStartSLOduration=1.361106499 podStartE2EDuration="4.458566316s" podCreationTimestamp="2025-09-30 17:57:49 +0000 UTC" firstStartedPulling="2025-09-30 17:57:50.107497496 +0000 UTC m=+920.629996734" lastFinishedPulling="2025-09-30 17:57:53.204957313 +0000 UTC m=+923.727456551" observedRunningTime="2025-09-30 17:57:53.454846734 +0000 UTC m=+923.977345972" watchObservedRunningTime="2025-09-30 17:57:53.458566316 +0000 UTC m=+923.981065554" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.298874 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.299692 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-tz2zw" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="registry-server" containerID="cri-o://cec738ef6fee476934e564b82afaaeec9847a21af6f6de518bf6b72b7fb1841a" gracePeriod=2 Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.449146 4797 generic.go:334] "Generic (PLEG): container finished" podID="814293a2-b8df-483c-9356-931fdf6f1298" containerID="cec738ef6fee476934e564b82afaaeec9847a21af6f6de518bf6b72b7fb1841a" exitCode=0 Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.449192 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerDied","Data":"cec738ef6fee476934e564b82afaaeec9847a21af6f6de518bf6b72b7fb1841a"} Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.779976 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.892997 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities\") pod \"814293a2-b8df-483c-9356-931fdf6f1298\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.893046 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content\") pod \"814293a2-b8df-483c-9356-931fdf6f1298\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.893086 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qc5j\" (UniqueName: \"kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j\") pod 
\"814293a2-b8df-483c-9356-931fdf6f1298\" (UID: \"814293a2-b8df-483c-9356-931fdf6f1298\") " Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.894654 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities" (OuterVolumeSpecName: "utilities") pod "814293a2-b8df-483c-9356-931fdf6f1298" (UID: "814293a2-b8df-483c-9356-931fdf6f1298"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.899009 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j" (OuterVolumeSpecName: "kube-api-access-8qc5j") pod "814293a2-b8df-483c-9356-931fdf6f1298" (UID: "814293a2-b8df-483c-9356-931fdf6f1298"). InnerVolumeSpecName "kube-api-access-8qc5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.937288 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "814293a2-b8df-483c-9356-931fdf6f1298" (UID: "814293a2-b8df-483c-9356-931fdf6f1298"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.994468 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.994512 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/814293a2-b8df-483c-9356-931fdf6f1298-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:55 crc kubenswrapper[4797]: I0930 17:57:55.994530 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qc5j\" (UniqueName: \"kubernetes.io/projected/814293a2-b8df-483c-9356-931fdf6f1298-kube-api-access-8qc5j\") on node \"crc\" DevicePath \"\"" Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.457656 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz2zw" event={"ID":"814293a2-b8df-483c-9356-931fdf6f1298","Type":"ContainerDied","Data":"d60af088b5ffd448becb501f7fb793178f79224dc3e5fb46aaf5b052c29cf71b"} Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.458060 4797 scope.go:117] "RemoveContainer" containerID="cec738ef6fee476934e564b82afaaeec9847a21af6f6de518bf6b72b7fb1841a" Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.457695 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tz2zw" Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.480624 4797 scope.go:117] "RemoveContainer" containerID="8aafcbede101daa4a77e017d4a5cb03e3b70d50755bd81217c087aa7519f20ae" Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.481173 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.490919 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tz2zw"] Sep 30 17:57:56 crc kubenswrapper[4797]: I0930 17:57:56.500638 4797 scope.go:117] "RemoveContainer" containerID="1bd31256128b2530c0af858e8e3d8c657d5bf35772ea7f1dd3288dd2192a8145" Sep 30 17:57:58 crc kubenswrapper[4797]: I0930 17:57:58.252772 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814293a2-b8df-483c-9356-931fdf6f1298" path="/var/lib/kubelet/pods/814293a2-b8df-483c-9356-931fdf6f1298/volumes" Sep 30 17:57:59 crc kubenswrapper[4797]: I0930 17:57:59.647272 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:59 crc kubenswrapper[4797]: I0930 17:57:59.647559 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:57:59 crc kubenswrapper[4797]: I0930 17:57:59.674662 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:58:00 crc kubenswrapper[4797]: I0930 17:58:00.529600 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qkts8" Sep 30 17:58:00 crc kubenswrapper[4797]: I0930 17:58:00.935758 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hwndx" Sep 30 17:58:05 crc 
kubenswrapper[4797]: I0930 17:58:05.737486 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6"] Sep 30 17:58:05 crc kubenswrapper[4797]: E0930 17:58:05.738080 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="extract-content" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.738097 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="extract-content" Sep 30 17:58:05 crc kubenswrapper[4797]: E0930 17:58:05.738119 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="extract-utilities" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.738128 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="extract-utilities" Sep 30 17:58:05 crc kubenswrapper[4797]: E0930 17:58:05.738140 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="registry-server" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.738149 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="registry-server" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.738297 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="814293a2-b8df-483c-9356-931fdf6f1298" containerName="registry-server" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.739551 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.744166 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-r2fh8" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.748111 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6"] Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.842603 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.842748 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.842811 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxkp\" (UniqueName: \"kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 
17:58:05.943729 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.943803 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxkp\" (UniqueName: \"kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.943845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.944320 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.944344 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:05 crc kubenswrapper[4797]: I0930 17:58:05.964401 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxkp\" (UniqueName: \"kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp\") pod \"32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:06 crc kubenswrapper[4797]: I0930 17:58:06.068515 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:06 crc kubenswrapper[4797]: I0930 17:58:06.487657 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6"] Sep 30 17:58:06 crc kubenswrapper[4797]: I0930 17:58:06.533497 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" event={"ID":"7746f536-ef36-41bd-9f28-94c03952ffde","Type":"ContainerStarted","Data":"d4e2226f2ce1a70fb52db2b79fa1046787f1c9b2678b830b89a1c0bca8264a0a"} Sep 30 17:58:07 crc kubenswrapper[4797]: I0930 17:58:07.543255 4797 generic.go:334] "Generic (PLEG): container finished" podID="7746f536-ef36-41bd-9f28-94c03952ffde" containerID="0b89729ac2dd92bbbfaaa39b6cc580add52360fe904fd17357242e8789ed65f3" exitCode=0 Sep 30 17:58:07 crc kubenswrapper[4797]: I0930 17:58:07.543391 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" event={"ID":"7746f536-ef36-41bd-9f28-94c03952ffde","Type":"ContainerDied","Data":"0b89729ac2dd92bbbfaaa39b6cc580add52360fe904fd17357242e8789ed65f3"} Sep 30 17:58:08 crc kubenswrapper[4797]: I0930 17:58:08.552294 4797 generic.go:334] "Generic (PLEG): container finished" podID="7746f536-ef36-41bd-9f28-94c03952ffde" containerID="26f51769db97761d64724e460e9496167298b21e8572e4f2f98adb2764b8ee10" exitCode=0 Sep 30 17:58:08 crc kubenswrapper[4797]: I0930 17:58:08.552339 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" event={"ID":"7746f536-ef36-41bd-9f28-94c03952ffde","Type":"ContainerDied","Data":"26f51769db97761d64724e460e9496167298b21e8572e4f2f98adb2764b8ee10"} Sep 30 17:58:09 crc kubenswrapper[4797]: I0930 17:58:09.563919 4797 generic.go:334] "Generic (PLEG): container finished" podID="7746f536-ef36-41bd-9f28-94c03952ffde" containerID="c5aed8d3054e2ed230977042d04f7df804e02d56033325776751e36490670221" exitCode=0 Sep 30 17:58:09 crc kubenswrapper[4797]: I0930 17:58:09.564007 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" event={"ID":"7746f536-ef36-41bd-9f28-94c03952ffde","Type":"ContainerDied","Data":"c5aed8d3054e2ed230977042d04f7df804e02d56033325776751e36490670221"} Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.857396 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.913554 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle\") pod \"7746f536-ef36-41bd-9f28-94c03952ffde\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.913644 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxkp\" (UniqueName: \"kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp\") pod \"7746f536-ef36-41bd-9f28-94c03952ffde\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.913699 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util\") pod \"7746f536-ef36-41bd-9f28-94c03952ffde\" (UID: \"7746f536-ef36-41bd-9f28-94c03952ffde\") " Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.914933 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle" (OuterVolumeSpecName: "bundle") pod "7746f536-ef36-41bd-9f28-94c03952ffde" (UID: "7746f536-ef36-41bd-9f28-94c03952ffde"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.922226 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp" (OuterVolumeSpecName: "kube-api-access-mtxkp") pod "7746f536-ef36-41bd-9f28-94c03952ffde" (UID: "7746f536-ef36-41bd-9f28-94c03952ffde"). InnerVolumeSpecName "kube-api-access-mtxkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:58:10 crc kubenswrapper[4797]: I0930 17:58:10.936577 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util" (OuterVolumeSpecName: "util") pod "7746f536-ef36-41bd-9f28-94c03952ffde" (UID: "7746f536-ef36-41bd-9f28-94c03952ffde"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.015355 4797 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.015692 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxkp\" (UniqueName: \"kubernetes.io/projected/7746f536-ef36-41bd-9f28-94c03952ffde-kube-api-access-mtxkp\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.015816 4797 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7746f536-ef36-41bd-9f28-94c03952ffde-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.585017 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" event={"ID":"7746f536-ef36-41bd-9f28-94c03952ffde","Type":"ContainerDied","Data":"d4e2226f2ce1a70fb52db2b79fa1046787f1c9b2678b830b89a1c0bca8264a0a"} Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.585081 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e2226f2ce1a70fb52db2b79fa1046787f1c9b2678b830b89a1c0bca8264a0a" Sep 30 17:58:11 crc kubenswrapper[4797]: I0930 17:58:11.585119 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6" Sep 30 17:58:14 crc kubenswrapper[4797]: I0930 17:58:14.192438 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:58:14 crc kubenswrapper[4797]: I0930 17:58:14.192858 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.679218 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh"] Sep 30 17:58:16 crc kubenswrapper[4797]: E0930 17:58:16.681443 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="pull" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.681530 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="pull" Sep 30 17:58:16 crc kubenswrapper[4797]: E0930 17:58:16.681612 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="util" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.681667 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="util" Sep 30 17:58:16 crc kubenswrapper[4797]: E0930 17:58:16.681723 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" 
containerName="extract" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.681777 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="extract" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.681934 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7746f536-ef36-41bd-9f28-94c03952ffde" containerName="extract" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.682613 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.690514 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-pjnnb" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.713529 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh"] Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.792088 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m58\" (UniqueName: \"kubernetes.io/projected/3192f5d1-b6e4-4471-9adb-24c613a970f4-kube-api-access-77m58\") pod \"openstack-operator-controller-operator-5c6649c9b9-x9zfh\" (UID: \"3192f5d1-b6e4-4471-9adb-24c613a970f4\") " pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.893387 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m58\" (UniqueName: \"kubernetes.io/projected/3192f5d1-b6e4-4471-9adb-24c613a970f4-kube-api-access-77m58\") pod \"openstack-operator-controller-operator-5c6649c9b9-x9zfh\" (UID: \"3192f5d1-b6e4-4471-9adb-24c613a970f4\") " pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 
17:58:16 crc kubenswrapper[4797]: I0930 17:58:16.912519 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m58\" (UniqueName: \"kubernetes.io/projected/3192f5d1-b6e4-4471-9adb-24c613a970f4-kube-api-access-77m58\") pod \"openstack-operator-controller-operator-5c6649c9b9-x9zfh\" (UID: \"3192f5d1-b6e4-4471-9adb-24c613a970f4\") " pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:17 crc kubenswrapper[4797]: I0930 17:58:17.003564 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:17 crc kubenswrapper[4797]: I0930 17:58:17.467764 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh"] Sep 30 17:58:17 crc kubenswrapper[4797]: I0930 17:58:17.623470 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" event={"ID":"3192f5d1-b6e4-4471-9adb-24c613a970f4","Type":"ContainerStarted","Data":"4fa749e3fa9595e2c311bd0c0fc2085d1b9a81a07aa8907f180bacecde6b603d"} Sep 30 17:58:21 crc kubenswrapper[4797]: I0930 17:58:21.647217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" event={"ID":"3192f5d1-b6e4-4471-9adb-24c613a970f4","Type":"ContainerStarted","Data":"9c24bee9afe376afb564774f546028451df44ddb72fb0798ba2b8ff3e3c67450"} Sep 30 17:58:23 crc kubenswrapper[4797]: I0930 17:58:23.666399 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" event={"ID":"3192f5d1-b6e4-4471-9adb-24c613a970f4","Type":"ContainerStarted","Data":"da3eaec2eb05915588927fa2ef64b2c39a2f12280d035f30d0cd717ed95f8ac9"} Sep 30 17:58:23 crc kubenswrapper[4797]: I0930 17:58:23.667127 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:23 crc kubenswrapper[4797]: I0930 17:58:23.701282 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" podStartSLOduration=2.086750123 podStartE2EDuration="7.701264547s" podCreationTimestamp="2025-09-30 17:58:16 +0000 UTC" firstStartedPulling="2025-09-30 17:58:17.474905177 +0000 UTC m=+947.997404415" lastFinishedPulling="2025-09-30 17:58:23.089419601 +0000 UTC m=+953.611918839" observedRunningTime="2025-09-30 17:58:23.699979833 +0000 UTC m=+954.222479111" watchObservedRunningTime="2025-09-30 17:58:23.701264547 +0000 UTC m=+954.223763795" Sep 30 17:58:27 crc kubenswrapper[4797]: I0930 17:58:27.006496 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5c6649c9b9-x9zfh" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.192244 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.192869 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.192915 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 17:58:44 crc 
kubenswrapper[4797]: I0930 17:58:44.193469 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.193522 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304" gracePeriod=600
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.280046 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.281235 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.286594 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6v2mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.295799 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.296975 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.312794 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fj5rf"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.345317 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.355409 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.357358 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.362396 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bd2qv"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.377507 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.378533 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.382445 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-svsjv"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.398407 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.399370 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.403580 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-dt9n6"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.411942 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.416788 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbg4\" (UniqueName: \"kubernetes.io/projected/481318fa-263c-4a4b-b775-879776670ddb-kube-api-access-mwbg4\") pod \"cinder-operator-controller-manager-644bddb6d8-wvthj\" (UID: \"481318fa-263c-4a4b-b775-879776670ddb\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.417411 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ck9l\" (UniqueName: \"kubernetes.io/projected/3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa-kube-api-access-5ck9l\") pod \"barbican-operator-controller-manager-6ff8b75857-csf68\" (UID: \"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.420578 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.442240 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.483787 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.505397 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.506938 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.513848 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-sglsx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.521799 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnmjr\" (UniqueName: \"kubernetes.io/projected/c3c39950-97e6-423c-8884-b65548f38830-kube-api-access-gnmjr\") pod \"heat-operator-controller-manager-5d889d78cf-h2z55\" (UID: \"c3c39950-97e6-423c-8884-b65548f38830\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.521855 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbg4\" (UniqueName: \"kubernetes.io/projected/481318fa-263c-4a4b-b775-879776670ddb-kube-api-access-mwbg4\") pod \"cinder-operator-controller-manager-644bddb6d8-wvthj\" (UID: \"481318fa-263c-4a4b-b775-879776670ddb\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.521944 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzq5\" (UniqueName: \"kubernetes.io/projected/89b215ec-763f-4eb9-aef0-7f5b1d43481d-kube-api-access-tmzq5\") pod \"designate-operator-controller-manager-84f4f7b77b-85wtx\" (UID: \"89b215ec-763f-4eb9-aef0-7f5b1d43481d\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.521988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftdd\" (UniqueName: \"kubernetes.io/projected/72a082c8-41b8-4666-bdd1-8f998dedc4c3-kube-api-access-cftdd\") pod \"glance-operator-controller-manager-84958c4d49-585mg\" (UID: \"72a082c8-41b8-4666-bdd1-8f998dedc4c3\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.522020 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ck9l\" (UniqueName: \"kubernetes.io/projected/3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa-kube-api-access-5ck9l\") pod \"barbican-operator-controller-manager-6ff8b75857-csf68\" (UID: \"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.544980 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.546484 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.548608 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbg4\" (UniqueName: \"kubernetes.io/projected/481318fa-263c-4a4b-b775-879776670ddb-kube-api-access-mwbg4\") pod \"cinder-operator-controller-manager-644bddb6d8-wvthj\" (UID: \"481318fa-263c-4a4b-b775-879776670ddb\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.548670 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.548976 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-g44fq"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.550777 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ck9l\" (UniqueName: \"kubernetes.io/projected/3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa-kube-api-access-5ck9l\") pod \"barbican-operator-controller-manager-6ff8b75857-csf68\" (UID: \"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.564129 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.566548 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.570259 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.571283 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.574793 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-27b6j"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.578713 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.579734 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.581496 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ztr7w"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.596152 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.601546 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.607527 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.608799 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.612627 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bxpk8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.612762 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.623005 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftdd\" (UniqueName: \"kubernetes.io/projected/72a082c8-41b8-4666-bdd1-8f998dedc4c3-kube-api-access-cftdd\") pod \"glance-operator-controller-manager-84958c4d49-585mg\" (UID: \"72a082c8-41b8-4666-bdd1-8f998dedc4c3\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.623064 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnmjr\" (UniqueName: \"kubernetes.io/projected/c3c39950-97e6-423c-8884-b65548f38830-kube-api-access-gnmjr\") pod \"heat-operator-controller-manager-5d889d78cf-h2z55\" (UID: \"c3c39950-97e6-423c-8884-b65548f38830\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.623101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlhs\" (UniqueName: \"kubernetes.io/projected/07a9edee-b2ec-48d8-85b3-191f2f29bf73-kube-api-access-jjlhs\") pod \"horizon-operator-controller-manager-9f4696d94-d2mm8\" (UID: \"07a9edee-b2ec-48d8-85b3-191f2f29bf73\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.623155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzq5\" (UniqueName: \"kubernetes.io/projected/89b215ec-763f-4eb9-aef0-7f5b1d43481d-kube-api-access-tmzq5\") pod \"designate-operator-controller-manager-84f4f7b77b-85wtx\" (UID: \"89b215ec-763f-4eb9-aef0-7f5b1d43481d\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.630135 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.632140 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n564q"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.633328 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.636228 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5ksd9"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.656926 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzq5\" (UniqueName: \"kubernetes.io/projected/89b215ec-763f-4eb9-aef0-7f5b1d43481d-kube-api-access-tmzq5\") pod \"designate-operator-controller-manager-84f4f7b77b-85wtx\" (UID: \"89b215ec-763f-4eb9-aef0-7f5b1d43481d\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.656993 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.657957 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.659986 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftdd\" (UniqueName: \"kubernetes.io/projected/72a082c8-41b8-4666-bdd1-8f998dedc4c3-kube-api-access-cftdd\") pod \"glance-operator-controller-manager-84958c4d49-585mg\" (UID: \"72a082c8-41b8-4666-bdd1-8f998dedc4c3\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.660270 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4gbgt"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.661603 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n564q"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.666668 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.667842 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnmjr\" (UniqueName: \"kubernetes.io/projected/c3c39950-97e6-423c-8884-b65548f38830-kube-api-access-gnmjr\") pod \"heat-operator-controller-manager-5d889d78cf-h2z55\" (UID: \"c3c39950-97e6-423c-8884-b65548f38830\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.670341 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.671383 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.671962 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.674735 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5gblq"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.685979 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.705338 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.713606 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.718138 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.719334 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724266 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjntq\" (UniqueName: \"kubernetes.io/projected/8824a3d0-28dc-42eb-b767-b9425f556076-kube-api-access-gjntq\") pod \"keystone-operator-controller-manager-5bd55b4bff-slc6t\" (UID: \"8824a3d0-28dc-42eb-b767-b9425f556076\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724305 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsht\" (UniqueName: \"kubernetes.io/projected/9f9e430f-f1af-46a5-9885-2e25473d376d-kube-api-access-vrsht\") pod \"ironic-operator-controller-manager-7975b88857-4wpww\" (UID: \"9f9e430f-f1af-46a5-9885-2e25473d376d\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724336 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5sdv\" (UniqueName: \"kubernetes.io/projected/24d8b2e3-5124-4bac-8cb1-871daabad7e6-kube-api-access-b5sdv\") pod \"manila-operator-controller-manager-6d68dbc695-vlc4g\" (UID: \"24d8b2e3-5124-4bac-8cb1-871daabad7e6\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724465 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hhl\" (UniqueName: \"kubernetes.io/projected/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-kube-api-access-29hhl\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.724522 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlhs\" (UniqueName: \"kubernetes.io/projected/07a9edee-b2ec-48d8-85b3-191f2f29bf73-kube-api-access-jjlhs\") pod \"horizon-operator-controller-manager-9f4696d94-d2mm8\" (UID: \"07a9edee-b2ec-48d8-85b3-191f2f29bf73\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.725823 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.728412 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ngktv"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.728809 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.731021 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.732256 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.734648 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-pvfh7"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.747589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlhs\" (UniqueName: \"kubernetes.io/projected/07a9edee-b2ec-48d8-85b3-191f2f29bf73-kube-api-access-jjlhs\") pod \"horizon-operator-controller-manager-9f4696d94-d2mm8\" (UID: \"07a9edee-b2ec-48d8-85b3-191f2f29bf73\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.750086 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.751449 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.756109 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.756694 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xppkz"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.757143 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.783137 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.784841 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vlnfz"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjntq\" (UniqueName: \"kubernetes.io/projected/8824a3d0-28dc-42eb-b767-b9425f556076-kube-api-access-gjntq\") pod \"keystone-operator-controller-manager-5bd55b4bff-slc6t\" (UID: \"8824a3d0-28dc-42eb-b767-b9425f556076\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826681 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsht\" (UniqueName: \"kubernetes.io/projected/9f9e430f-f1af-46a5-9885-2e25473d376d-kube-api-access-vrsht\") pod \"ironic-operator-controller-manager-7975b88857-4wpww\" (UID: \"9f9e430f-f1af-46a5-9885-2e25473d376d\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826782 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5sdv\" (UniqueName: \"kubernetes.io/projected/24d8b2e3-5124-4bac-8cb1-871daabad7e6-kube-api-access-b5sdv\") pod \"manila-operator-controller-manager-6d68dbc695-vlc4g\" (UID: \"24d8b2e3-5124-4bac-8cb1-871daabad7e6\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826888 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dx4\" (UniqueName: \"kubernetes.io/projected/9508eece-17ea-4b43-9bdb-6c2f8da6e21f-kube-api-access-b5dx4\") pod \"neutron-operator-controller-manager-64d7b59854-qlrv8\" (UID: \"9508eece-17ea-4b43-9bdb-6c2f8da6e21f\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvfk\" (UniqueName: \"kubernetes.io/projected/65d324d6-26a4-4a59-a29d-a92cad26a07a-kube-api-access-5lvfk\") pod \"mariadb-operator-controller-manager-88c7-n564q\" (UID: \"65d324d6-26a4-4a59-a29d-a92cad26a07a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.826943 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfmql\" (UniqueName: \"kubernetes.io/projected/3195fff1-53f5-491a-869b-0f7fc5e45df6-kube-api-access-dfmql\") pod \"octavia-operator-controller-manager-76fcc6dc7c-c58zh\" (UID: \"3195fff1-53f5-491a-869b-0f7fc5e45df6\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.827006 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hhl\" (UniqueName: \"kubernetes.io/projected/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-kube-api-access-29hhl\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.827047 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqv5f\" (UniqueName: \"kubernetes.io/projected/04f5a5c7-892a-4aa5-8e21-ec847d9e29fb-kube-api-access-dqv5f\") pod \"nova-operator-controller-manager-c7c776c96-d7r9l\" (UID: \"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"
Sep 30 17:58:44 crc kubenswrapper[4797]: E0930 17:58:44.827391 4797 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Sep 30 17:58:44 crc kubenswrapper[4797]: E0930 17:58:44.827465 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert podName:1c601f06-9978-4f2b-8f37-2fa1bef8e8dd nodeName:}" failed. No retries permitted until 2025-09-30 17:58:45.327446135 +0000 UTC m=+975.849945373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert") pod "infra-operator-controller-manager-7d857cc749-9zcc2" (UID: "1c601f06-9978-4f2b-8f37-2fa1bef8e8dd") : secret "infra-operator-webhook-server-cert" not found
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.827506 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.838852 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.843097 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.848597 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.849687 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304" exitCode=0
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.849766 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304"}
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.849832 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df"}
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.849854 4797 scope.go:117] "RemoveContainer" containerID="ce8c5c7ec3f2afcaab85363de392623ca1e51f3441e1f3e66b88c01887d0f151"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.850674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsht\" (UniqueName: \"kubernetes.io/projected/9f9e430f-f1af-46a5-9885-2e25473d376d-kube-api-access-vrsht\") pod \"ironic-operator-controller-manager-7975b88857-4wpww\" (UID: \"9f9e430f-f1af-46a5-9885-2e25473d376d\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.852850 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjntq\" (UniqueName: \"kubernetes.io/projected/8824a3d0-28dc-42eb-b767-b9425f556076-kube-api-access-gjntq\") pod \"keystone-operator-controller-manager-5bd55b4bff-slc6t\" (UID: \"8824a3d0-28dc-42eb-b767-b9425f556076\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.869556 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.869580 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hhl\" (UniqueName: \"kubernetes.io/projected/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-kube-api-access-29hhl\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.870663 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.871100 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5sdv\" (UniqueName: \"kubernetes.io/projected/24d8b2e3-5124-4bac-8cb1-871daabad7e6-kube-api-access-b5sdv\") pod \"manila-operator-controller-manager-6d68dbc695-vlc4g\" (UID: \"24d8b2e3-5124-4bac-8cb1-871daabad7e6\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.872796 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bppq4"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.893459 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.903106 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6"]
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.904925 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.915672 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4tdxb"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.919708 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928044 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dx4\" (UniqueName: \"kubernetes.io/projected/9508eece-17ea-4b43-9bdb-6c2f8da6e21f-kube-api-access-b5dx4\") pod \"neutron-operator-controller-manager-64d7b59854-qlrv8\" (UID: \"9508eece-17ea-4b43-9bdb-6c2f8da6e21f\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928086 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5dpr\" (UniqueName: \"kubernetes.io/projected/d69ffa93-8979-4922-8aee-7ea26fede6b4-kube-api-access-x5dpr\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928110 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvfk\" (UniqueName: \"kubernetes.io/projected/65d324d6-26a4-4a59-a29d-a92cad26a07a-kube-api-access-5lvfk\") pod \"mariadb-operator-controller-manager-88c7-n564q\" (UID: \"65d324d6-26a4-4a59-a29d-a92cad26a07a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928162 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfmql\" (UniqueName: \"kubernetes.io/projected/3195fff1-53f5-491a-869b-0f7fc5e45df6-kube-api-access-dfmql\") pod \"octavia-operator-controller-manager-76fcc6dc7c-c58zh\" (UID: \"3195fff1-53f5-491a-869b-0f7fc5e45df6\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928202 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhqv\" (UniqueName: \"kubernetes.io/projected/05a4965a-f8d6-4859-ab5d-87773f6f6981-kube-api-access-9jhqv\") pod \"ovn-operator-controller-manager-9976ff44c-dbxcb\" (UID: \"05a4965a-f8d6-4859-ab5d-87773f6f6981\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqv5f\" (UniqueName: \"kubernetes.io/projected/04f5a5c7-892a-4aa5-8e21-ec847d9e29fb-kube-api-access-dqv5f\") pod \"nova-operator-controller-manager-c7c776c96-d7r9l\" (UID: \"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.928240 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmx7w\" (UniqueName: \"kubernetes.io/projected/8a1ceaa0-b6e6-442d-84d0-3fba075b136c-kube-api-access-wmx7w\") pod \"placement-operator-controller-manager-589c58c6c-xh5rb\" (UID: \"8a1ceaa0-b6e6-442d-84d0-3fba075b136c\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb"
Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.932068 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.946118 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.950271 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfmql\" (UniqueName: \"kubernetes.io/projected/3195fff1-53f5-491a-869b-0f7fc5e45df6-kube-api-access-dfmql\") pod \"octavia-operator-controller-manager-76fcc6dc7c-c58zh\" (UID: \"3195fff1-53f5-491a-869b-0f7fc5e45df6\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.957665 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvfk\" (UniqueName: \"kubernetes.io/projected/65d324d6-26a4-4a59-a29d-a92cad26a07a-kube-api-access-5lvfk\") pod \"mariadb-operator-controller-manager-88c7-n564q\" (UID: \"65d324d6-26a4-4a59-a29d-a92cad26a07a\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.961615 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqv5f\" (UniqueName: \"kubernetes.io/projected/04f5a5c7-892a-4aa5-8e21-ec847d9e29fb-kube-api-access-dqv5f\") pod \"nova-operator-controller-manager-c7c776c96-d7r9l\" (UID: \"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.962866 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6"] Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.964852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b5dx4\" (UniqueName: \"kubernetes.io/projected/9508eece-17ea-4b43-9bdb-6c2f8da6e21f-kube-api-access-b5dx4\") pod \"neutron-operator-controller-manager-64d7b59854-qlrv8\" (UID: \"9508eece-17ea-4b43-9bdb-6c2f8da6e21f\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" Sep 30 17:58:44 crc kubenswrapper[4797]: I0930 17:58:44.996073 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.007590 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-66ff6"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.008783 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.011222 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bt5zd" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.014851 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.028937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5dpr\" (UniqueName: \"kubernetes.io/projected/d69ffa93-8979-4922-8aee-7ea26fede6b4-kube-api-access-x5dpr\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.028986 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.029076 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhqv\" (UniqueName: \"kubernetes.io/projected/05a4965a-f8d6-4859-ab5d-87773f6f6981-kube-api-access-9jhqv\") pod \"ovn-operator-controller-manager-9976ff44c-dbxcb\" (UID: \"05a4965a-f8d6-4859-ab5d-87773f6f6981\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.029111 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cckbk\" (UniqueName: \"kubernetes.io/projected/ff0b391c-ac01-4a17-9381-a1e2b00d044d-kube-api-access-cckbk\") pod \"telemetry-operator-controller-manager-b8d54b5d7-5vhf6\" (UID: \"ff0b391c-ac01-4a17-9381-a1e2b00d044d\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:58:45 crc 
kubenswrapper[4797]: I0930 17:58:45.029142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmx7w\" (UniqueName: \"kubernetes.io/projected/8a1ceaa0-b6e6-442d-84d0-3fba075b136c-kube-api-access-wmx7w\") pod \"placement-operator-controller-manager-589c58c6c-xh5rb\" (UID: \"8a1ceaa0-b6e6-442d-84d0-3fba075b136c\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.029170 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4rx\" (UniqueName: \"kubernetes.io/projected/c9c41380-c9ee-4467-b343-0f6cf78d51bc-kube-api-access-7z4rx\") pod \"swift-operator-controller-manager-bc7dc7bd9-grp8f\" (UID: \"c9c41380-c9ee-4467-b343-0f6cf78d51bc\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.030090 4797 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.030132 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert podName:d69ffa93-8979-4922-8aee-7ea26fede6b4 nodeName:}" failed. No retries permitted until 2025-09-30 17:58:45.530117267 +0000 UTC m=+976.052616505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-zj2w6" (UID: "d69ffa93-8979-4922-8aee-7ea26fede6b4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.031344 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.047258 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.053870 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5dpr\" (UniqueName: \"kubernetes.io/projected/d69ffa93-8979-4922-8aee-7ea26fede6b4-kube-api-access-x5dpr\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.054272 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmx7w\" (UniqueName: \"kubernetes.io/projected/8a1ceaa0-b6e6-442d-84d0-3fba075b136c-kube-api-access-wmx7w\") pod \"placement-operator-controller-manager-589c58c6c-xh5rb\" (UID: \"8a1ceaa0-b6e6-442d-84d0-3fba075b136c\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.054607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhqv\" (UniqueName: \"kubernetes.io/projected/05a4965a-f8d6-4859-ab5d-87773f6f6981-kube-api-access-9jhqv\") pod \"ovn-operator-controller-manager-9976ff44c-dbxcb\" (UID: \"05a4965a-f8d6-4859-ab5d-87773f6f6981\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.066688 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-66ff6"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.085729 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.116562 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.117711 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.119470 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4m9nd" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.126801 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.132072 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4rx\" (UniqueName: \"kubernetes.io/projected/c9c41380-c9ee-4467-b343-0f6cf78d51bc-kube-api-access-7z4rx\") pod \"swift-operator-controller-manager-bc7dc7bd9-grp8f\" (UID: \"c9c41380-c9ee-4467-b343-0f6cf78d51bc\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.132194 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrvv\" (UniqueName: \"kubernetes.io/projected/508a8f28-1d71-43ac-b24b-65f226abf807-kube-api-access-qxrvv\") pod \"watcher-operator-controller-manager-598db9dcc9-jbsh8\" (UID: \"508a8f28-1d71-43ac-b24b-65f226abf807\") " pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.132223 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5sz\" (UniqueName: \"kubernetes.io/projected/be859676-32a8-4144-94fd-ab0da94ce6bc-kube-api-access-6c5sz\") pod \"test-operator-controller-manager-f66b554c6-66ff6\" (UID: \"be859676-32a8-4144-94fd-ab0da94ce6bc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.132256 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cckbk\" (UniqueName: \"kubernetes.io/projected/ff0b391c-ac01-4a17-9381-a1e2b00d044d-kube-api-access-cckbk\") pod \"telemetry-operator-controller-manager-b8d54b5d7-5vhf6\" (UID: \"ff0b391c-ac01-4a17-9381-a1e2b00d044d\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.135614 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.154606 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.156532 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.156715 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.157005 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cckbk\" (UniqueName: \"kubernetes.io/projected/ff0b391c-ac01-4a17-9381-a1e2b00d044d-kube-api-access-cckbk\") pod \"telemetry-operator-controller-manager-b8d54b5d7-5vhf6\" (UID: \"ff0b391c-ac01-4a17-9381-a1e2b00d044d\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.158480 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.158719 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9b5b2" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.161386 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4rx\" (UniqueName: \"kubernetes.io/projected/c9c41380-c9ee-4467-b343-0f6cf78d51bc-kube-api-access-7z4rx\") pod \"swift-operator-controller-manager-bc7dc7bd9-grp8f\" (UID: \"c9c41380-c9ee-4467-b343-0f6cf78d51bc\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.172148 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.172972 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.175684 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6jtk7" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.175900 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.234243 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrvv\" (UniqueName: \"kubernetes.io/projected/508a8f28-1d71-43ac-b24b-65f226abf807-kube-api-access-qxrvv\") pod \"watcher-operator-controller-manager-598db9dcc9-jbsh8\" (UID: \"508a8f28-1d71-43ac-b24b-65f226abf807\") " pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.234890 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5sz\" (UniqueName: \"kubernetes.io/projected/be859676-32a8-4144-94fd-ab0da94ce6bc-kube-api-access-6c5sz\") pod \"test-operator-controller-manager-f66b554c6-66ff6\" (UID: \"be859676-32a8-4144-94fd-ab0da94ce6bc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.256607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrvv\" (UniqueName: \"kubernetes.io/projected/508a8f28-1d71-43ac-b24b-65f226abf807-kube-api-access-qxrvv\") pod \"watcher-operator-controller-manager-598db9dcc9-jbsh8\" (UID: \"508a8f28-1d71-43ac-b24b-65f226abf807\") " pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.258365 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.267346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.269507 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5sz\" (UniqueName: \"kubernetes.io/projected/be859676-32a8-4144-94fd-ab0da94ce6bc-kube-api-access-6c5sz\") pod \"test-operator-controller-manager-f66b554c6-66ff6\" (UID: \"be859676-32a8-4144-94fd-ab0da94ce6bc\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.314874 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.342629 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.343048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.343126 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c62np\" (UniqueName: \"kubernetes.io/projected/ca63e090-a37a-4150-9c58-edf133c74c99-kube-api-access-c62np\") pod \"rabbitmq-cluster-operator-manager-79d8469568-gcwrs\" (UID: \"ca63e090-a37a-4150-9c58-edf133c74c99\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.343165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.343251 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72kp\" (UniqueName: \"kubernetes.io/projected/a618912d-66f8-4486-8e69-d3dc16f3cb34-kube-api-access-j72kp\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.350930 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c601f06-9978-4f2b-8f37-2fa1bef8e8dd-cert\") pod \"infra-operator-controller-manager-7d857cc749-9zcc2\" (UID: \"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.368236 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.452158 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72kp\" (UniqueName: \"kubernetes.io/projected/a618912d-66f8-4486-8e69-d3dc16f3cb34-kube-api-access-j72kp\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.452250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c62np\" (UniqueName: \"kubernetes.io/projected/ca63e090-a37a-4150-9c58-edf133c74c99-kube-api-access-c62np\") pod \"rabbitmq-cluster-operator-manager-79d8469568-gcwrs\" (UID: \"ca63e090-a37a-4150-9c58-edf133c74c99\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.452273 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.452483 4797 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: 
secret "webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.452534 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert podName:a618912d-66f8-4486-8e69-d3dc16f3cb34 nodeName:}" failed. No retries permitted until 2025-09-30 17:58:45.952520207 +0000 UTC m=+976.475019445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert") pod "openstack-operator-controller-manager-5dc7c668c-46p2f" (UID: "a618912d-66f8-4486-8e69-d3dc16f3cb34") : secret "webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.492340 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72kp\" (UniqueName: \"kubernetes.io/projected/a618912d-66f8-4486-8e69-d3dc16f3cb34-kube-api-access-j72kp\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.499198 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c62np\" (UniqueName: \"kubernetes.io/projected/ca63e090-a37a-4150-9c58-edf133c74c99-kube-api-access-c62np\") pod \"rabbitmq-cluster-operator-manager-79d8469568-gcwrs\" (UID: \"ca63e090-a37a-4150-9c58-edf133c74c99\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.501759 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.534498 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.554054 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.563068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d69ffa93-8979-4922-8aee-7ea26fede6b4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-zj2w6\" (UID: \"d69ffa93-8979-4922-8aee-7ea26fede6b4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.604319 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.712947 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.859681 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" event={"ID":"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa","Type":"ContainerStarted","Data":"4240af17c617e8aeba5972977bcecbcfc7d2b2cb7df30c9b78f96823b95cb6d6"} Sep 30 17:58:45 crc kubenswrapper[4797]: W0930 17:58:45.912488 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481318fa_263c_4a4b_b775_879776670ddb.slice/crio-030aa6c3703a739be0c140cf6f1ffa6ea275af5ec7cfe4f40878c7564d7a60cb WatchSource:0}: Error finding container 030aa6c3703a739be0c140cf6f1ffa6ea275af5ec7cfe4f40878c7564d7a60cb: Status 404 returned error can't find the container with id 030aa6c3703a739be0c140cf6f1ffa6ea275af5ec7cfe4f40878c7564d7a60cb Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.914963 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55"] Sep 30 17:58:45 crc kubenswrapper[4797]: W0930 17:58:45.915454 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c39950_97e6_423c_8884_b65548f38830.slice/crio-605617c442521ffc96d10c7ce209e1f6b634b065c5aec1b825d4ae3ca59e8f0c WatchSource:0}: Error finding container 605617c442521ffc96d10c7ce209e1f6b634b065c5aec1b825d4ae3ca59e8f0c: Status 404 returned error can't find the container with id 605617c442521ffc96d10c7ce209e1f6b634b065c5aec1b825d4ae3ca59e8f0c Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.920379 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.953340 
4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-585mg"] Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.957570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.957812 4797 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: E0930 17:58:45.960150 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert podName:a618912d-66f8-4486-8e69-d3dc16f3cb34 nodeName:}" failed. No retries permitted until 2025-09-30 17:58:46.960124842 +0000 UTC m=+977.482624080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert") pod "openstack-operator-controller-manager-5dc7c668c-46p2f" (UID: "a618912d-66f8-4486-8e69-d3dc16f3cb34") : secret "webhook-server-cert" not found Sep 30 17:58:45 crc kubenswrapper[4797]: I0930 17:58:45.967196 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8"] Sep 30 17:58:45 crc kubenswrapper[4797]: W0930 17:58:45.969309 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a9edee_b2ec_48d8_85b3_191f2f29bf73.slice/crio-904896c173ebd525e2810d10b40e2fb8416c10bb390074baafbb8cbbdc442207 WatchSource:0}: Error finding container 904896c173ebd525e2810d10b40e2fb8416c10bb390074baafbb8cbbdc442207: Status 404 returned error can't find the container with id 904896c173ebd525e2810d10b40e2fb8416c10bb390074baafbb8cbbdc442207 Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.025515 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.034232 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n564q"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.038415 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.130352 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.136287 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh"] Sep 
30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.140003 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l"] Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.145621 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24d8b2e3_5124_4bac_8cb1_871daabad7e6.slice/crio-adb3e2a8143c4d67c1976dd9f4395aaa12aff3bf74bfae5456032b6cc70b2789 WatchSource:0}: Error finding container adb3e2a8143c4d67c1976dd9f4395aaa12aff3bf74bfae5456032b6cc70b2789: Status 404 returned error can't find the container with id adb3e2a8143c4d67c1976dd9f4395aaa12aff3bf74bfae5456032b6cc70b2789 Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.148758 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3195fff1_53f5_491a_869b_0f7fc5e45df6.slice/crio-06b3f97f8c287ea409e5785c44f84de4d96b00136fa05610a69a9d6e56734e5e WatchSource:0}: Error finding container 06b3f97f8c287ea409e5785c44f84de4d96b00136fa05610a69a9d6e56734e5e: Status 404 returned error can't find the container with id 06b3f97f8c287ea409e5785c44f84de4d96b00136fa05610a69a9d6e56734e5e Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.166568 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t"] Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.166803 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8824a3d0_28dc_42eb_b767_b9425f556076.slice/crio-b57eca4dec5412f8b5e99152a6f42a9ad749a541242e6861debd948d538aa043 WatchSource:0}: Error finding container b57eca4dec5412f8b5e99152a6f42a9ad749a541242e6861debd948d538aa043: Status 404 returned error can't find the container with id 
b57eca4dec5412f8b5e99152a6f42a9ad749a541242e6861debd948d538aa043 Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.509738 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-66ff6"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.516596 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.523907 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f"] Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.527169 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe859676_32a8_4144_94fd_ab0da94ce6bc.slice/crio-231ae958825ef8a3aad202bd2324c6e2e357b0727927918b05040a0475bfc313 WatchSource:0}: Error finding container 231ae958825ef8a3aad202bd2324c6e2e357b0727927918b05040a0475bfc313: Status 404 returned error can't find the container with id 231ae958825ef8a3aad202bd2324c6e2e357b0727927918b05040a0475bfc313 Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.536749 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2"] Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.539972 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c41380_c9ee_4467_b343_0f6cf78d51bc.slice/crio-ec0fc2dfbd57c80db055130ad38aad2448215ceaa1e52ace9ad5f10055b1ee06 WatchSource:0}: Error finding container ec0fc2dfbd57c80db055130ad38aad2448215ceaa1e52ace9ad5f10055b1ee06: Status 404 returned error can't find the container with id ec0fc2dfbd57c80db055130ad38aad2448215ceaa1e52ace9ad5f10055b1ee06 Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.542322 4797 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.548303 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.557122 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs"] Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.565228 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb"] Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.565660 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cckbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-5vhf6_openstack-operators(ff0b391c-ac01-4a17-9381-a1e2b00d044d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.567693 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.120:5001/openstack-k8s-operators/watcher-operator:ddbfeae56acf552d63a96c2e6f420cfbcebbdcf9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxrvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-598db9dcc9-jbsh8_openstack-operators(508a8f28-1d71-43ac-b24b-65f226abf807): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.568794 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9508eece_17ea_4b43_9bdb_6c2f8da6e21f.slice/crio-1c9db1cec5cc1d1164c98cb63757f9adaa218100e5dd0327dc277946d4e86c1b WatchSource:0}: Error finding container 1c9db1cec5cc1d1164c98cb63757f9adaa218100e5dd0327dc277946d4e86c1b: Status 404 returned error can't find the container with id 1c9db1cec5cc1d1164c98cb63757f9adaa218100e5dd0327dc277946d4e86c1b Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.570698 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c601f06_9978_4f2b_8f37_2fa1bef8e8dd.slice/crio-c7879f8280b998412707530809ce6941db7c8a262699f8b86aacf37248adce4c WatchSource:0}: Error finding container c7879f8280b998412707530809ce6941db7c8a262699f8b86aacf37248adce4c: Status 404 returned error can't find the container with id c7879f8280b998412707530809ce6941db7c8a262699f8b86aacf37248adce4c Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.572530 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5dx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64d7b59854-qlrv8_openstack-operators(9508eece-17ea-4b43-9bdb-6c2f8da6e21f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.572613 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb"] Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.574396 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29hhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-9zcc2_openstack-operators(1c601f06-9978-4f2b-8f37-2fa1bef8e8dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.579553 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6"] Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.605962 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca63e090_a37a_4150_9c58_edf133c74c99.slice/crio-51d9cea0fc16d4a59bbc25610c6a391321043fe97c2f9ffc14269c64f9e4fbc9 WatchSource:0}: Error finding container 51d9cea0fc16d4a59bbc25610c6a391321043fe97c2f9ffc14269c64f9e4fbc9: Status 404 returned error can't find the container with id 51d9cea0fc16d4a59bbc25610c6a391321043fe97c2f9ffc14269c64f9e4fbc9 Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.607969 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c62np,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-gcwrs_openstack-operators(ca63e090-a37a-4150-9c58-edf133c74c99): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 
17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.609642 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" podUID="ca63e090-a37a-4150-9c58-edf133c74c99" Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.612621 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a4965a_f8d6_4859_ab5d_87773f6f6981.slice/crio-dd4758eff9b99ed021df9ad65e896e5e6c70edbbd57c653270a91b826f20136f WatchSource:0}: Error finding container dd4758eff9b99ed021df9ad65e896e5e6c70edbbd57c653270a91b826f20136f: Status 404 returned error can't find the container with id dd4758eff9b99ed021df9ad65e896e5e6c70edbbd57c653270a91b826f20136f Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.616415 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9jhqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-dbxcb_openstack-operators(05a4965a-f8d6-4859-ab5d-87773f6f6981): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.617118 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1ceaa0_b6e6_442d_84d0_3fba075b136c.slice/crio-2b5fea8cd1ee91744de361a1ad153201908a1323d578dad68c28fec16dbb2f96 WatchSource:0}: Error finding container 
2b5fea8cd1ee91744de361a1ad153201908a1323d578dad68c28fec16dbb2f96: Status 404 returned error can't find the container with id 2b5fea8cd1ee91744de361a1ad153201908a1323d578dad68c28fec16dbb2f96 Sep 30 17:58:46 crc kubenswrapper[4797]: W0930 17:58:46.620073 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69ffa93_8979_4922_8aee_7ea26fede6b4.slice/crio-bc64701553e6ae379033ecf84e666cdd5fb83ec98c1258df778c44deb8ae1562 WatchSource:0}: Error finding container bc64701553e6ae379033ecf84e666cdd5fb83ec98c1258df778c44deb8ae1562: Status 404 returned error can't find the container with id bc64701553e6ae379033ecf84e666cdd5fb83ec98c1258df778c44deb8ae1562 Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.630288 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmx7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-xh5rb_openstack-operators(8a1ceaa0-b6e6-442d-84d0-3fba075b136c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.630367 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5dpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-6d776955-zj2w6_openstack-operators(d69ffa93-8979-4922-8aee-7ea26fede6b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.770154 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" podUID="ff0b391c-ac01-4a17-9381-a1e2b00d044d" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.807182 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" podUID="508a8f28-1d71-43ac-b24b-65f226abf807" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.824672 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" podUID="1c601f06-9978-4f2b-8f37-2fa1bef8e8dd" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.853265 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" podUID="05a4965a-f8d6-4859-ab5d-87773f6f6981" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.874469 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" event={"ID":"72a082c8-41b8-4666-bdd1-8f998dedc4c3","Type":"ContainerStarted","Data":"82ea1e96dcde9cd3f6e83fc519f4311493c49c5af702433afcf93c6e50cc6bc7"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.877692 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" event={"ID":"ff0b391c-ac01-4a17-9381-a1e2b00d044d","Type":"ContainerStarted","Data":"82dd700327aede4342ba05cb99498810df4b884e8647e85e3567f43948f66cd5"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.877724 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" event={"ID":"ff0b391c-ac01-4a17-9381-a1e2b00d044d","Type":"ContainerStarted","Data":"091ba0a1437b16753beac3cdce7abba61a73a004ea79f3e727505b00be5a1bb4"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.879364 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" podUID="ff0b391c-ac01-4a17-9381-a1e2b00d044d" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.880423 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" event={"ID":"be859676-32a8-4144-94fd-ab0da94ce6bc","Type":"ContainerStarted","Data":"231ae958825ef8a3aad202bd2324c6e2e357b0727927918b05040a0475bfc313"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.882295 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" event={"ID":"07a9edee-b2ec-48d8-85b3-191f2f29bf73","Type":"ContainerStarted","Data":"904896c173ebd525e2810d10b40e2fb8416c10bb390074baafbb8cbbdc442207"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.890969 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" 
event={"ID":"9f9e430f-f1af-46a5-9885-2e25473d376d","Type":"ContainerStarted","Data":"d032dcc3a98a67876a2ed9b772598c5d34c1dd643e740c84f77f5faf565c1d13"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.892171 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" event={"ID":"ca63e090-a37a-4150-9c58-edf133c74c99","Type":"ContainerStarted","Data":"51d9cea0fc16d4a59bbc25610c6a391321043fe97c2f9ffc14269c64f9e4fbc9"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.893639 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" podUID="ca63e090-a37a-4150-9c58-edf133c74c99" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.895847 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" event={"ID":"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd","Type":"ContainerStarted","Data":"bcf200965dc249d1d23bb107f73b4b8ea4d3e450a977bc86e6c7974b3e87febe"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.895877 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" event={"ID":"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd","Type":"ContainerStarted","Data":"c7879f8280b998412707530809ce6941db7c8a262699f8b86aacf37248adce4c"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.898698 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" 
event={"ID":"05a4965a-f8d6-4859-ab5d-87773f6f6981","Type":"ContainerStarted","Data":"4d9bb9fcb8a4581d3fc25b6d3df20680137335205cbf0d20a180cfb508fe2acb"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.898722 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" event={"ID":"05a4965a-f8d6-4859-ab5d-87773f6f6981","Type":"ContainerStarted","Data":"dd4758eff9b99ed021df9ad65e896e5e6c70edbbd57c653270a91b826f20136f"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.899356 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" podUID="1c601f06-9978-4f2b-8f37-2fa1bef8e8dd" Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.900469 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" podUID="05a4965a-f8d6-4859-ab5d-87773f6f6981" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.925904 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" event={"ID":"d69ffa93-8979-4922-8aee-7ea26fede6b4","Type":"ContainerStarted","Data":"bc64701553e6ae379033ecf84e666cdd5fb83ec98c1258df778c44deb8ae1562"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.928258 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" 
event={"ID":"3195fff1-53f5-491a-869b-0f7fc5e45df6","Type":"ContainerStarted","Data":"06b3f97f8c287ea409e5785c44f84de4d96b00136fa05610a69a9d6e56734e5e"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.929789 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" podUID="d69ffa93-8979-4922-8aee-7ea26fede6b4" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.930162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" event={"ID":"481318fa-263c-4a4b-b775-879776670ddb","Type":"ContainerStarted","Data":"030aa6c3703a739be0c140cf6f1ffa6ea275af5ec7cfe4f40878c7564d7a60cb"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.931211 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" event={"ID":"65d324d6-26a4-4a59-a29d-a92cad26a07a","Type":"ContainerStarted","Data":"d2f2859c5017fd090600e6bfef4dba61c41ba82c3bc62f2470d993af798fe40f"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.935247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" event={"ID":"9508eece-17ea-4b43-9bdb-6c2f8da6e21f","Type":"ContainerStarted","Data":"1c9db1cec5cc1d1164c98cb63757f9adaa218100e5dd0327dc277946d4e86c1b"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.947743 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" podUID="9508eece-17ea-4b43-9bdb-6c2f8da6e21f" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.948777 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" event={"ID":"8824a3d0-28dc-42eb-b767-b9425f556076","Type":"ContainerStarted","Data":"b57eca4dec5412f8b5e99152a6f42a9ad749a541242e6861debd948d538aa043"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.950966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" event={"ID":"24d8b2e3-5124-4bac-8cb1-871daabad7e6","Type":"ContainerStarted","Data":"adb3e2a8143c4d67c1976dd9f4395aaa12aff3bf74bfae5456032b6cc70b2789"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.951912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" event={"ID":"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb","Type":"ContainerStarted","Data":"13c61918a08392d74f66a47d347d33a781649122870aed79c54dafe9ddb3bfb8"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.953625 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" event={"ID":"89b215ec-763f-4eb9-aef0-7f5b1d43481d","Type":"ContainerStarted","Data":"6b8d8ada8dcdc2aa2c7226ddb11ca29c8405968b98ffa2180cc2cb662a530695"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.955292 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" event={"ID":"8a1ceaa0-b6e6-442d-84d0-3fba075b136c","Type":"ContainerStarted","Data":"2b5fea8cd1ee91744de361a1ad153201908a1323d578dad68c28fec16dbb2f96"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.960321 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" event={"ID":"c3c39950-97e6-423c-8884-b65548f38830","Type":"ContainerStarted","Data":"605617c442521ffc96d10c7ce209e1f6b634b065c5aec1b825d4ae3ca59e8f0c"} Sep 30 17:58:46 crc 
kubenswrapper[4797]: E0930 17:58:46.964600 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" podUID="8a1ceaa0-b6e6-442d-84d0-3fba075b136c" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.969690 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" event={"ID":"508a8f28-1d71-43ac-b24b-65f226abf807","Type":"ContainerStarted","Data":"306e1ff0107452d1f6a38ed561a0da95de58b6ec8420ed08faf1bee72ba14a37"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.969729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" event={"ID":"508a8f28-1d71-43ac-b24b-65f226abf807","Type":"ContainerStarted","Data":"5b9a10af1d2c07087d8bc414bb697b5bc5895767bc841653a1e1ed40a1fb161a"} Sep 30 17:58:46 crc kubenswrapper[4797]: E0930 17:58:46.971604 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.120:5001/openstack-k8s-operators/watcher-operator:ddbfeae56acf552d63a96c2e6f420cfbcebbdcf9\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" podUID="508a8f28-1d71-43ac-b24b-65f226abf807" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.973566 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" event={"ID":"c9c41380-c9ee-4467-b343-0f6cf78d51bc","Type":"ContainerStarted","Data":"ec0fc2dfbd57c80db055130ad38aad2448215ceaa1e52ace9ad5f10055b1ee06"} Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.973999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:46 crc kubenswrapper[4797]: I0930 17:58:46.982855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a618912d-66f8-4486-8e69-d3dc16f3cb34-cert\") pod \"openstack-operator-controller-manager-5dc7c668c-46p2f\" (UID: \"a618912d-66f8-4486-8e69-d3dc16f3cb34\") " pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:47 crc kubenswrapper[4797]: I0930 17:58:47.056739 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:47 crc kubenswrapper[4797]: I0930 17:58:47.843380 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f"] Sep 30 17:58:47 crc kubenswrapper[4797]: I0930 17:58:47.982208 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" event={"ID":"d69ffa93-8979-4922-8aee-7ea26fede6b4","Type":"ContainerStarted","Data":"458d22b971de893f046294ceb776ee0d1b8b4794d8eddb0f61e7de42128c3cea"} Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.984979 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" podUID="d69ffa93-8979-4922-8aee-7ea26fede6b4" Sep 30 17:58:47 crc kubenswrapper[4797]: I0930 
17:58:47.986678 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" event={"ID":"9508eece-17ea-4b43-9bdb-6c2f8da6e21f","Type":"ContainerStarted","Data":"0e056452394f3ecddfad7a63276f1e5a06d6563d466764989f04bfe788fea04e"} Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.988494 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" podUID="9508eece-17ea-4b43-9bdb-6c2f8da6e21f" Sep 30 17:58:47 crc kubenswrapper[4797]: I0930 17:58:47.990086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" event={"ID":"8a1ceaa0-b6e6-442d-84d0-3fba075b136c","Type":"ContainerStarted","Data":"aa7c175be2ff68b7e24de3ce474cf3040e9d31c1287aafe8e6717330d1f6fcc4"} Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.992238 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" podUID="ff0b391c-ac01-4a17-9381-a1e2b00d044d" Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.992280 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" podUID="ca63e090-a37a-4150-9c58-edf133c74c99" Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.992519 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" podUID="05a4965a-f8d6-4859-ab5d-87773f6f6981" Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.992748 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" podUID="1c601f06-9978-4f2b-8f37-2fa1bef8e8dd" Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.993013 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" podUID="8a1ceaa0-b6e6-442d-84d0-3fba075b136c" Sep 30 17:58:47 crc kubenswrapper[4797]: E0930 17:58:47.993054 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.120:5001/openstack-k8s-operators/watcher-operator:ddbfeae56acf552d63a96c2e6f420cfbcebbdcf9\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" podUID="508a8f28-1d71-43ac-b24b-65f226abf807" Sep 30 
17:58:48 crc kubenswrapper[4797]: E0930 17:58:48.998835 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" podUID="8a1ceaa0-b6e6-442d-84d0-3fba075b136c" Sep 30 17:58:49 crc kubenswrapper[4797]: E0930 17:58:48.999606 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" podUID="d69ffa93-8979-4922-8aee-7ea26fede6b4" Sep 30 17:58:49 crc kubenswrapper[4797]: E0930 17:58:49.003340 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" podUID="9508eece-17ea-4b43-9bdb-6c2f8da6e21f" Sep 30 17:58:50 crc kubenswrapper[4797]: W0930 17:58:50.579402 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda618912d_66f8_4486_8e69_d3dc16f3cb34.slice/crio-bc1be5e00703fe30e7c1e6ccb979802e1428c8ae5e7455f4c77181f6ffa8186b WatchSource:0}: Error finding container bc1be5e00703fe30e7c1e6ccb979802e1428c8ae5e7455f4c77181f6ffa8186b: Status 404 returned error can't find the container with id bc1be5e00703fe30e7c1e6ccb979802e1428c8ae5e7455f4c77181f6ffa8186b Sep 30 17:58:51 crc 
kubenswrapper[4797]: I0930 17:58:51.012858 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" event={"ID":"a618912d-66f8-4486-8e69-d3dc16f3cb34","Type":"ContainerStarted","Data":"bc1be5e00703fe30e7c1e6ccb979802e1428c8ae5e7455f4c77181f6ffa8186b"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.104418 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" event={"ID":"c9c41380-c9ee-4467-b343-0f6cf78d51bc","Type":"ContainerStarted","Data":"592fcf8b6d5b101a1e7e145950bf78f730a575e78812c9ba4fb273871fde54eb"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.108411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" event={"ID":"481318fa-263c-4a4b-b775-879776670ddb","Type":"ContainerStarted","Data":"17b94882b72a72bb9db0da4e089ff3c9fd6770af68b23f591e629a55ea2d704b"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.112216 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" event={"ID":"be859676-32a8-4144-94fd-ab0da94ce6bc","Type":"ContainerStarted","Data":"26b74cf921cc8062d42a57bd2dba60aa4ef69662313af58edfd297f0e8ff938f"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.114606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" event={"ID":"72a082c8-41b8-4666-bdd1-8f998dedc4c3","Type":"ContainerStarted","Data":"6d70973e9129eb48678b34964f02a45ecccaf284941b09c27873280800eafff5"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.122993 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" 
event={"ID":"24d8b2e3-5124-4bac-8cb1-871daabad7e6","Type":"ContainerStarted","Data":"60ff80d1bf374f13259be11979885380e5d975fc9dc0f8cc4166b6bac6335d22"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.124045 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" event={"ID":"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb","Type":"ContainerStarted","Data":"d26f60438e6e8929fb45ca5ca0f37599c6ce5248f4f392f12044a3e2c628dfef"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.124876 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" event={"ID":"c3c39950-97e6-423c-8884-b65548f38830","Type":"ContainerStarted","Data":"767d34ebeb06c3a58cd32ca2455cdaccb6f7f33171b643ab63c71f8ae02d4301"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.126155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" event={"ID":"8824a3d0-28dc-42eb-b767-b9425f556076","Type":"ContainerStarted","Data":"83ef1432d97318c4a04e5b7244483931786371f407eeedf49c46bbe0cba7eb08"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.136926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" event={"ID":"3195fff1-53f5-491a-869b-0f7fc5e45df6","Type":"ContainerStarted","Data":"1ae8a86a3d2a71eaeb9586e0eba3db119cd9482d4839d95e4fe12eebfc5bd545"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.150846 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" event={"ID":"a618912d-66f8-4486-8e69-d3dc16f3cb34","Type":"ContainerStarted","Data":"4aac4747bc86ac126873f04933723a74a20a3dcac4ad3e766ddcbf8411dc6e34"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.173333 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" event={"ID":"07a9edee-b2ec-48d8-85b3-191f2f29bf73","Type":"ContainerStarted","Data":"fdd71bce90101d3065e5f473f5a0c83cad63543635973d0f5d2eeb9bf934d49b"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.183775 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" event={"ID":"9f9e430f-f1af-46a5-9885-2e25473d376d","Type":"ContainerStarted","Data":"514a845a12cb6322638206a494922c3f014a69b27a06fe5c206619a57735e43a"} Sep 30 17:58:57 crc kubenswrapper[4797]: I0930 17:58:57.209000 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" event={"ID":"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa","Type":"ContainerStarted","Data":"1241c27f88e3ddbe6ac9558b662242160750dc4e82c4c3d9c110f378e285eba7"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.217040 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" event={"ID":"24d8b2e3-5124-4bac-8cb1-871daabad7e6","Type":"ContainerStarted","Data":"0505a73320459ad0aed635dd159aa81e15278a6573b854f624fda56e4196dd6b"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.217489 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.219252 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" event={"ID":"c9c41380-c9ee-4467-b343-0f6cf78d51bc","Type":"ContainerStarted","Data":"b91d86d5434aabdb0c4a8ad2676b9c9af4d8c0a68f47a318cfc71696a4dad109"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.219950 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.221827 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" event={"ID":"a618912d-66f8-4486-8e69-d3dc16f3cb34","Type":"ContainerStarted","Data":"bb40f57a6df0f299d9781caa3b10ceb57b861051bf1a5c08224477f4499ab6bc"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.222214 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.223858 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" event={"ID":"481318fa-263c-4a4b-b775-879776670ddb","Type":"ContainerStarted","Data":"00b3964a690d4cc4431ef3451a235a0aa4e572cf47fce34eeafbf40573d29a41"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.224286 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.225675 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" event={"ID":"8824a3d0-28dc-42eb-b767-b9425f556076","Type":"ContainerStarted","Data":"61f170134dc4985f947313287d37d3363fa57565fdf27d19c43793398418d118"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.226089 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.228653 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" 
event={"ID":"3195fff1-53f5-491a-869b-0f7fc5e45df6","Type":"ContainerStarted","Data":"76c6ea6d63d471d4beedce432320eb7c8d80bb1fea5cbf1ad9d2c02a8d56540a"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.228827 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.230248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" event={"ID":"be859676-32a8-4144-94fd-ab0da94ce6bc","Type":"ContainerStarted","Data":"1ccfc4f9ab7880ae28be7641042a548563754bbe5f13732fda6e2b25781961c3"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.230366 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.231661 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" event={"ID":"9f9e430f-f1af-46a5-9885-2e25473d376d","Type":"ContainerStarted","Data":"353aa731c452363071e7f3cac8c4f8012876b5cdf2f48554bc757aa6604b001e"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.231779 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.232924 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" event={"ID":"3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa","Type":"ContainerStarted","Data":"9eabd55399b027d3e2418dca1ce9d25d7bf6118a2520eb69aec47da76165af5c"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.233297 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.234676 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" event={"ID":"72a082c8-41b8-4666-bdd1-8f998dedc4c3","Type":"ContainerStarted","Data":"4f4a1ede2b103e3ae9395ceb07ab89dacf39bdbb3724f487ec89a105a37620a4"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.235083 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.242925 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" podStartSLOduration=3.957299254 podStartE2EDuration="14.242909676s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.148490463 +0000 UTC m=+976.670989701" lastFinishedPulling="2025-09-30 17:58:56.434100865 +0000 UTC m=+986.956600123" observedRunningTime="2025-09-30 17:58:58.238331632 +0000 UTC m=+988.760830870" watchObservedRunningTime="2025-09-30 17:58:58.242909676 +0000 UTC m=+988.765408914" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247348 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" event={"ID":"04f5a5c7-892a-4aa5-8e21-ec847d9e29fb","Type":"ContainerStarted","Data":"3569ec1ff82d60402e419847b4a087d8a881b236637da420394dc3643bd25f77"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247487 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" event={"ID":"89b215ec-763f-4eb9-aef0-7f5b1d43481d","Type":"ContainerStarted","Data":"f646d257bdca007be5ae53776e3de902c6822a57377d0f038ef88feca1d58dc1"} Sep 30 
17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247568 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247652 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" event={"ID":"89b215ec-763f-4eb9-aef0-7f5b1d43481d","Type":"ContainerStarted","Data":"a6e9e17b98d52a4e067a5141e2de2dd7e5aa9115d3bdb73923284116fd47b7f8"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" event={"ID":"07a9edee-b2ec-48d8-85b3-191f2f29bf73","Type":"ContainerStarted","Data":"29028221a69a8a079eb6293298e0e02cbd01f750260f8b111bb624e9dcb347e1"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.247821 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.248817 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" event={"ID":"c3c39950-97e6-423c-8884-b65548f38830","Type":"ContainerStarted","Data":"615a06aa54d87398e07210b6a3bdf54c4ced4003751a5e6ce2e8b79e681930c5"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.249618 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.251039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" event={"ID":"65d324d6-26a4-4a59-a29d-a92cad26a07a","Type":"ContainerStarted","Data":"f1bcd08b5377f9b6018b53f41591bb40b4dfe3b1fbe1fc55363a087edc1f438a"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.251068 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" event={"ID":"65d324d6-26a4-4a59-a29d-a92cad26a07a","Type":"ContainerStarted","Data":"e0aa0e6a74ae8083f50a478db262b26d121b15b07cf03ba61d6af135ec28789c"} Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.251638 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.263926 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" podStartSLOduration=4.001723038 podStartE2EDuration="14.26391152s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.152604226 +0000 UTC m=+976.675103464" lastFinishedPulling="2025-09-30 17:58:56.414792688 +0000 UTC m=+986.937291946" observedRunningTime="2025-09-30 17:58:58.262412959 +0000 UTC m=+988.784912197" watchObservedRunningTime="2025-09-30 17:58:58.26391152 +0000 UTC m=+988.786410758" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.280162 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" podStartSLOduration=4.375923582 podStartE2EDuration="14.280145513s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.528878296 +0000 UTC m=+977.051377534" lastFinishedPulling="2025-09-30 17:58:56.433100217 +0000 UTC m=+986.955599465" observedRunningTime="2025-09-30 17:58:58.274221442 +0000 UTC m=+988.796720680" 
watchObservedRunningTime="2025-09-30 17:58:58.280145513 +0000 UTC m=+988.802644751" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.291610 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" podStartSLOduration=4.407458084 podStartE2EDuration="14.291594316s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.552993815 +0000 UTC m=+977.075493053" lastFinishedPulling="2025-09-30 17:58:56.437130047 +0000 UTC m=+986.959629285" observedRunningTime="2025-09-30 17:58:58.290465864 +0000 UTC m=+988.812965102" watchObservedRunningTime="2025-09-30 17:58:58.291594316 +0000 UTC m=+988.814093554" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.311647 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" podStartSLOduration=3.865211272 podStartE2EDuration="14.311630212s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:45.956616707 +0000 UTC m=+976.479115945" lastFinishedPulling="2025-09-30 17:58:56.403035647 +0000 UTC m=+986.925534885" observedRunningTime="2025-09-30 17:58:58.310581434 +0000 UTC m=+988.833080672" watchObservedRunningTime="2025-09-30 17:58:58.311630212 +0000 UTC m=+988.834129440" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.331468 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" podStartSLOduration=3.284496571 podStartE2EDuration="14.331453293s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:45.36800475 +0000 UTC m=+975.890503988" lastFinishedPulling="2025-09-30 17:58:56.414961472 +0000 UTC m=+986.937460710" observedRunningTime="2025-09-30 17:58:58.329388637 +0000 UTC m=+988.851887875" 
watchObservedRunningTime="2025-09-30 17:58:58.331453293 +0000 UTC m=+988.853952531" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.354287 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" podStartSLOduration=4.095861658 podStartE2EDuration="14.354272036s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.152596996 +0000 UTC m=+976.675096234" lastFinishedPulling="2025-09-30 17:58:56.411007354 +0000 UTC m=+986.933506612" observedRunningTime="2025-09-30 17:58:58.351742667 +0000 UTC m=+988.874241905" watchObservedRunningTime="2025-09-30 17:58:58.354272036 +0000 UTC m=+988.876771274" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.399286 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" podStartSLOduration=14.399270134 podStartE2EDuration="14.399270134s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:58:58.397074214 +0000 UTC m=+988.919573482" watchObservedRunningTime="2025-09-30 17:58:58.399270134 +0000 UTC m=+988.921769372" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.425670 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" podStartSLOduration=4.042559993 podStartE2EDuration="14.425648775s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.051018263 +0000 UTC m=+976.573517491" lastFinishedPulling="2025-09-30 17:58:56.434107025 +0000 UTC m=+986.956606273" observedRunningTime="2025-09-30 17:58:58.421572103 +0000 UTC m=+988.944071351" watchObservedRunningTime="2025-09-30 17:58:58.425648775 +0000 UTC 
m=+988.948148013" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.455398 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" podStartSLOduration=3.957649835 podStartE2EDuration="14.455381586s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:45.917044756 +0000 UTC m=+976.439543994" lastFinishedPulling="2025-09-30 17:58:56.414776507 +0000 UTC m=+986.937275745" observedRunningTime="2025-09-30 17:58:58.453789253 +0000 UTC m=+988.976288491" watchObservedRunningTime="2025-09-30 17:58:58.455381586 +0000 UTC m=+988.977880824" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.475496 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" podStartSLOduration=4.233315999 podStartE2EDuration="14.475475944s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.168834378 +0000 UTC m=+976.691333616" lastFinishedPulling="2025-09-30 17:58:56.410994323 +0000 UTC m=+986.933493561" observedRunningTime="2025-09-30 17:58:58.473881811 +0000 UTC m=+988.996381049" watchObservedRunningTime="2025-09-30 17:58:58.475475944 +0000 UTC m=+988.997975192" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.516364 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" podStartSLOduration=4.152117403 podStartE2EDuration="14.51634792s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.050595731 +0000 UTC m=+976.573094969" lastFinishedPulling="2025-09-30 17:58:56.414826248 +0000 UTC m=+986.937325486" observedRunningTime="2025-09-30 17:58:58.497366792 +0000 UTC m=+989.019866030" watchObservedRunningTime="2025-09-30 17:58:58.51634792 +0000 UTC m=+989.038847158" Sep 30 
17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.517194 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" podStartSLOduration=4.020362847 podStartE2EDuration="14.517190683s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:45.918241259 +0000 UTC m=+976.440740497" lastFinishedPulling="2025-09-30 17:58:56.415069085 +0000 UTC m=+986.937568333" observedRunningTime="2025-09-30 17:58:58.515064616 +0000 UTC m=+989.037563854" watchObservedRunningTime="2025-09-30 17:58:58.517190683 +0000 UTC m=+989.039689921" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.531008 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" podStartSLOduration=4.132689652 podStartE2EDuration="14.53099249s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.036942518 +0000 UTC m=+976.559441756" lastFinishedPulling="2025-09-30 17:58:56.435245366 +0000 UTC m=+986.957744594" observedRunningTime="2025-09-30 17:58:58.52880516 +0000 UTC m=+989.051304398" watchObservedRunningTime="2025-09-30 17:58:58.53099249 +0000 UTC m=+989.053491728" Sep 30 17:58:58 crc kubenswrapper[4797]: I0930 17:58:58.549662 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" podStartSLOduration=4.106770995 podStartE2EDuration="14.549647839s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:45.972051477 +0000 UTC m=+976.494550715" lastFinishedPulling="2025-09-30 17:58:56.414928311 +0000 UTC m=+986.937427559" observedRunningTime="2025-09-30 17:58:58.54746666 +0000 UTC m=+989.069965918" watchObservedRunningTime="2025-09-30 17:58:58.549647839 +0000 UTC m=+989.072147077" Sep 30 17:59:01 crc kubenswrapper[4797]: I0930 
17:59:01.277811 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" event={"ID":"508a8f28-1d71-43ac-b24b-65f226abf807","Type":"ContainerStarted","Data":"7bb993e2fea5098cd5ea51a41dbdef5a77168aa3108e70769269170973b0a54e"} Sep 30 17:59:01 crc kubenswrapper[4797]: I0930 17:59:01.279219 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:59:01 crc kubenswrapper[4797]: I0930 17:59:01.302975 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" podStartSLOduration=3.635603986 podStartE2EDuration="17.302959982s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.567574323 +0000 UTC m=+977.090073551" lastFinishedPulling="2025-09-30 17:59:00.234930289 +0000 UTC m=+990.757429547" observedRunningTime="2025-09-30 17:59:01.295213571 +0000 UTC m=+991.817712809" watchObservedRunningTime="2025-09-30 17:59:01.302959982 +0000 UTC m=+991.825459210" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.313263 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" event={"ID":"ca63e090-a37a-4150-9c58-edf133c74c99","Type":"ContainerStarted","Data":"17c06454b0a11a43df1045050bbe55266e050a0cc82793c948f1f50c35a7bf1b"} Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.317733 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" event={"ID":"1c601f06-9978-4f2b-8f37-2fa1bef8e8dd","Type":"ContainerStarted","Data":"b82fc2b47a47fd6ea509078e40daa1921d5360c4caa6aaa25080e3e4e6c34484"} Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.318025 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.328622 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-gcwrs" podStartSLOduration=2.768838543 podStartE2EDuration="19.328570338s" podCreationTimestamp="2025-09-30 17:58:45 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.607877563 +0000 UTC m=+977.130376801" lastFinishedPulling="2025-09-30 17:59:03.167609348 +0000 UTC m=+993.690108596" observedRunningTime="2025-09-30 17:59:04.326894622 +0000 UTC m=+994.849393870" watchObservedRunningTime="2025-09-30 17:59:04.328570338 +0000 UTC m=+994.851069576" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.352927 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" podStartSLOduration=3.790803421 podStartE2EDuration="20.352911282s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.574275985 +0000 UTC m=+977.096775233" lastFinishedPulling="2025-09-30 17:59:03.136383866 +0000 UTC m=+993.658883094" observedRunningTime="2025-09-30 17:59:04.350184807 +0000 UTC m=+994.872684045" watchObservedRunningTime="2025-09-30 17:59:04.352911282 +0000 UTC m=+994.875410520" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.633289 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-csf68" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.675069 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-wvthj" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.693809 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-85wtx" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.716517 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-585mg" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.734406 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h2z55" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.842156 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-d2mm8" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.922738 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4wpww" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.944864 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-slc6t" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.951353 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vlc4g" Sep 30 17:59:04 crc kubenswrapper[4797]: I0930 17:59:04.999090 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n564q" Sep 30 17:59:05 crc kubenswrapper[4797]: I0930 17:59:05.035520 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-d7r9l" Sep 30 17:59:05 crc kubenswrapper[4797]: I0930 17:59:05.061328 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-c58zh" Sep 30 17:59:05 crc kubenswrapper[4797]: I0930 17:59:05.262839 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-grp8f" Sep 30 17:59:05 crc kubenswrapper[4797]: I0930 17:59:05.346687 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-66ff6" Sep 30 17:59:05 crc kubenswrapper[4797]: I0930 17:59:05.537919 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-598db9dcc9-jbsh8" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.334615 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" event={"ID":"05a4965a-f8d6-4859-ab5d-87773f6f6981","Type":"ContainerStarted","Data":"cabc625998b97bb0636895a1eebfb7514f52ea64b3af5539a53d34119a34d817"} Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.335446 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.338036 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" event={"ID":"9508eece-17ea-4b43-9bdb-6c2f8da6e21f","Type":"ContainerStarted","Data":"7775ce043c5d10bcb0ed44d6e9ac3ff0231399d14eb17a4178d9f2c21208d3b1"} Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.338240 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.339608 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" event={"ID":"8a1ceaa0-b6e6-442d-84d0-3fba075b136c","Type":"ContainerStarted","Data":"86c05d8e9006391ed03a5a44dbb2672ef947ed237bce58dab16bcb49cb19fa4e"} Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.339810 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.341275 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" event={"ID":"ff0b391c-ac01-4a17-9381-a1e2b00d044d","Type":"ContainerStarted","Data":"4aa67f21a6a46205bd98e617f9affa5b5d37b82bbe14d45be4fa187e0c7b69c6"} Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.341463 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.357831 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" podStartSLOduration=3.375476015 podStartE2EDuration="22.357813497s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.616312043 +0000 UTC m=+977.138811281" lastFinishedPulling="2025-09-30 17:59:05.598649515 +0000 UTC m=+996.121148763" observedRunningTime="2025-09-30 17:59:06.352117901 +0000 UTC m=+996.874617149" watchObservedRunningTime="2025-09-30 17:59:06.357813497 +0000 UTC m=+996.880312735" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.369475 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" podStartSLOduration=3.407508818 podStartE2EDuration="22.369457634s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" 
firstStartedPulling="2025-09-30 17:58:46.630168841 +0000 UTC m=+977.152668079" lastFinishedPulling="2025-09-30 17:59:05.592117657 +0000 UTC m=+996.114616895" observedRunningTime="2025-09-30 17:59:06.368341264 +0000 UTC m=+996.890840502" watchObservedRunningTime="2025-09-30 17:59:06.369457634 +0000 UTC m=+996.891956872" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.396658 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" podStartSLOduration=3.376067721 podStartE2EDuration="22.396640457s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.572407034 +0000 UTC m=+977.094906272" lastFinishedPulling="2025-09-30 17:59:05.59297975 +0000 UTC m=+996.115479008" observedRunningTime="2025-09-30 17:59:06.395247559 +0000 UTC m=+996.917746807" watchObservedRunningTime="2025-09-30 17:59:06.396640457 +0000 UTC m=+996.919139695" Sep 30 17:59:06 crc kubenswrapper[4797]: I0930 17:59:06.417528 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" podStartSLOduration=3.370771936 podStartE2EDuration="22.417510266s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.565551637 +0000 UTC m=+977.088050875" lastFinishedPulling="2025-09-30 17:59:05.612289957 +0000 UTC m=+996.134789205" observedRunningTime="2025-09-30 17:59:06.414687589 +0000 UTC m=+996.937186837" watchObservedRunningTime="2025-09-30 17:59:06.417510266 +0000 UTC m=+996.940009504" Sep 30 17:59:07 crc kubenswrapper[4797]: I0930 17:59:07.066072 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5dc7c668c-46p2f" Sep 30 17:59:07 crc kubenswrapper[4797]: I0930 17:59:07.378507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" event={"ID":"d69ffa93-8979-4922-8aee-7ea26fede6b4","Type":"ContainerStarted","Data":"bdce3456f4889e34b933645b9b370bcfb46e7ed3016e8091f90e2f2e75a0ddc5"} Sep 30 17:59:07 crc kubenswrapper[4797]: I0930 17:59:07.430822 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" podStartSLOduration=3.202811092 podStartE2EDuration="23.430802445s" podCreationTimestamp="2025-09-30 17:58:44 +0000 UTC" firstStartedPulling="2025-09-30 17:58:46.630046598 +0000 UTC m=+977.152545836" lastFinishedPulling="2025-09-30 17:59:06.858037901 +0000 UTC m=+997.380537189" observedRunningTime="2025-09-30 17:59:07.417236954 +0000 UTC m=+997.939736222" watchObservedRunningTime="2025-09-30 17:59:07.430802445 +0000 UTC m=+997.953301693" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.019078 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-qlrv8" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.090597 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-dbxcb" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.136324 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-xh5rb" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.271615 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5vhf6" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.510495 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-9zcc2" Sep 30 17:59:15 
crc kubenswrapper[4797]: I0930 17:59:15.713113 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:59:15 crc kubenswrapper[4797]: I0930 17:59:15.721569 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-zj2w6" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.123939 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.126044 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.130694 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4jgq2" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.130751 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.131475 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.134820 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.135085 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.217688 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.220192 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.224836 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.229553 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.300676 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.300835 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.300904 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.300930 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qfl\" (UniqueName: \"kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 
crc kubenswrapper[4797]: I0930 17:59:34.300991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv9pd\" (UniqueName: \"kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.401706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.401761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.401783 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qfl\" (UniqueName: \"kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.401804 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv9pd\" (UniqueName: \"kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: 
I0930 17:59:34.401847 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.402753 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.402794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.403339 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.421236 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv9pd\" (UniqueName: \"kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd\") pod \"dnsmasq-dns-78dd6ddcc-jk6wp\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.421898 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qfl\" (UniqueName: 
\"kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl\") pod \"dnsmasq-dns-675f4bcbfc-647hm\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.456491 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.576002 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:34 crc kubenswrapper[4797]: I0930 17:59:34.908598 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 17:59:35 crc kubenswrapper[4797]: I0930 17:59:35.027182 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 17:59:35 crc kubenswrapper[4797]: W0930 17:59:35.039158 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e53c2c0_a774_4a3e_accc_40524c99ea82.slice/crio-ff9469db42bc100c9eb0304cf476218f5f75e683dc310d44122455f788312c8f WatchSource:0}: Error finding container ff9469db42bc100c9eb0304cf476218f5f75e683dc310d44122455f788312c8f: Status 404 returned error can't find the container with id ff9469db42bc100c9eb0304cf476218f5f75e683dc310d44122455f788312c8f Sep 30 17:59:35 crc kubenswrapper[4797]: I0930 17:59:35.660820 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" event={"ID":"8e53c2c0-a774-4a3e-accc-40524c99ea82","Type":"ContainerStarted","Data":"ff9469db42bc100c9eb0304cf476218f5f75e683dc310d44122455f788312c8f"} Sep 30 17:59:35 crc kubenswrapper[4797]: I0930 17:59:35.662796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" 
event={"ID":"564b4715-8914-4183-b843-4d87fb41145a","Type":"ContainerStarted","Data":"b66cee8580afb9097459e7c8fb7e234a0332499efa898eb103fb64093d477d67"} Sep 30 17:59:36 crc kubenswrapper[4797]: I0930 17:59:36.904569 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 17:59:36 crc kubenswrapper[4797]: I0930 17:59:36.921987 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 17:59:36 crc kubenswrapper[4797]: I0930 17:59:36.925352 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:36 crc kubenswrapper[4797]: I0930 17:59:36.938649 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.048011 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxdt\" (UniqueName: \"kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.048059 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.048084 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " 
pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.140581 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.151208 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.151467 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxdt\" (UniqueName: \"kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.151510 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.152318 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.152507 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" 
(UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.161870 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.163618 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.175559 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxdt\" (UniqueName: \"kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt\") pod \"dnsmasq-dns-666b6646f7-qcpsp\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.192882 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.247563 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.354094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.354150 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqfb\" (UniqueName: \"kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.354191 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.458445 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.458491 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqfb\" (UniqueName: \"kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: 
\"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.458542 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.459767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.461190 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.477976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqfb\" (UniqueName: \"kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb\") pod \"dnsmasq-dns-57d769cc4f-ch6v6\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.520810 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 17:59:37 crc kubenswrapper[4797]: I0930 17:59:37.753099 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.030024 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.031910 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037030 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037122 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037271 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037368 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037512 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.037690 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.038640 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.039030 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-spc9v" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.046653 4797 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075331 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075384 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075406 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 
17:59:38.075450 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075503 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkth4\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075525 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075864 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.075901 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.176904 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.176993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177020 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177047 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177086 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177118 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177157 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkth4\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.177197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.178292 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.178451 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.178588 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.178953 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.179166 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.182598 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.182962 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.184204 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.196118 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.196317 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.202674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkth4\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.210756 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.299467 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.301032 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.305675 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.306396 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.306503 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.306413 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.306414 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.310756 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.310971 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lx7d9" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.311158 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.364259 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388066 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388134 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388205 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49krn\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388250 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388282 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388333 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388364 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388490 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388513 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.388555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.489982 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490025 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490073 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490092 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 
17:59:38.490110 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490151 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490175 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490203 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49krn\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490228 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.490916 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.493208 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.496364 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.496457 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.497025 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.497266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.500358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.501495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.507359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc 
kubenswrapper[4797]: I0930 17:59:38.509388 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.512361 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49krn\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.518562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.620116 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.707621 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" event={"ID":"6c5a18c6-a06e-4ef7-aa5a-078979990eee","Type":"ContainerStarted","Data":"482d0d18d44ec1b3b0d2e569aaa425d2b65826ca281dc4e59ed4b3063cc5b717"} Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.708742 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" event={"ID":"9e788f55-44b8-4d71-b419-023ad236d45c","Type":"ContainerStarted","Data":"2def468362e5d068b187faaea74d4262d28dc85cb19d4b1aab19d4ca79c47539"} Sep 30 17:59:38 crc kubenswrapper[4797]: I0930 17:59:38.901077 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:59:39 crc kubenswrapper[4797]: I0930 17:59:39.190651 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.156688 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.158661 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.164186 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nvj6w" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.166093 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.166527 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.166811 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.167543 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.169375 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.170558 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275542 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275564 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275635 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmpj\" (UniqueName: \"kubernetes.io/projected/9b64d922-5a17-4831-b87e-78ae0a9a9042-kube-api-access-xcmpj\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275687 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275704 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.275720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.280298 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.285959 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.292308 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.292390 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.292454 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.292487 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-w9kjr" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.315886 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376831 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376878 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376929 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-secrets\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " 
pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.376981 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377007 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377045 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377063 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377078 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377105 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377125 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-default\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377150 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377172 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377187 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvklz\" (UniqueName: \"kubernetes.io/projected/c386be5e-6533-42b6-8a82-512c4c60cab2-kube-api-access-tvklz\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377202 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-kolla-config\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377246 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmpj\" (UniqueName: \"kubernetes.io/projected/9b64d922-5a17-4831-b87e-78ae0a9a9042-kube-api-access-xcmpj\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377364 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377658 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.377946 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.378205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.380035 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b64d922-5a17-4831-b87e-78ae0a9a9042-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.389384 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.407984 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.408539 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9b64d922-5a17-4831-b87e-78ae0a9a9042-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.412871 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmpj\" (UniqueName: \"kubernetes.io/projected/9b64d922-5a17-4831-b87e-78ae0a9a9042-kube-api-access-xcmpj\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.427942 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b64d922-5a17-4831-b87e-78ae0a9a9042\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478128 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " 
pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478223 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478244 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-default\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478272 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478288 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvklz\" (UniqueName: \"kubernetes.io/projected/c386be5e-6533-42b6-8a82-512c4c60cab2-kube-api-access-tvklz\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478304 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-kolla-config\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478324 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478352 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.478379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-secrets\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.480046 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.480269 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.480309 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.480941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-config-data-default\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.481243 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c386be5e-6533-42b6-8a82-512c4c60cab2-kolla-config\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.481679 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.486832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.487665 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-secrets\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.489325 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386be5e-6533-42b6-8a82-512c4c60cab2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.525718 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvklz\" (UniqueName: \"kubernetes.io/projected/c386be5e-6533-42b6-8a82-512c4c60cab2-kube-api-access-tvklz\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.531532 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"c386be5e-6533-42b6-8a82-512c4c60cab2\") " pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.555872 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 
30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.556759 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.569179 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.569778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9v577" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.569904 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.587781 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.611962 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.681075 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kolla-config\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.681127 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.681178 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-config-data\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.681249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.681283 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w962l\" (UniqueName: \"kubernetes.io/projected/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kube-api-access-w962l\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.782758 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.782819 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w962l\" (UniqueName: \"kubernetes.io/projected/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kube-api-access-w962l\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.782869 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kolla-config\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " 
pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.782895 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.782940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-config-data\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.783800 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-config-data\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.784055 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kolla-config\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.786042 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.788305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.803235 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w962l\" (UniqueName: \"kubernetes.io/projected/6e728b8d-50b8-43df-bb8c-e3cbfce614e9-kube-api-access-w962l\") pod \"memcached-0\" (UID: \"6e728b8d-50b8-43df-bb8c-e3cbfce614e9\") " pod="openstack/memcached-0" Sep 30 17:59:41 crc kubenswrapper[4797]: I0930 17:59:41.883944 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 17:59:43 crc kubenswrapper[4797]: W0930 17:59:43.053619 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65610b42_1ed9_4a27_996a_09e0ebd560e5.slice/crio-d9e349f7b742a7281d6b3dac55b1fcbb11bf69f22d50fa610428485307bf03bd WatchSource:0}: Error finding container d9e349f7b742a7281d6b3dac55b1fcbb11bf69f22d50fa610428485307bf03bd: Status 404 returned error can't find the container with id d9e349f7b742a7281d6b3dac55b1fcbb11bf69f22d50fa610428485307bf03bd Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.499924 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.501265 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.502632 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mcpfs" Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.514106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvqvn\" (UniqueName: \"kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn\") pod \"kube-state-metrics-0\" (UID: \"6270d3ec-66cb-42c0-97d5-a83f4fe7c854\") " pod="openstack/kube-state-metrics-0" Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.514658 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.615488 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvqvn\" (UniqueName: \"kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn\") pod \"kube-state-metrics-0\" (UID: \"6270d3ec-66cb-42c0-97d5-a83f4fe7c854\") " pod="openstack/kube-state-metrics-0" Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.655306 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvqvn\" (UniqueName: \"kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn\") pod \"kube-state-metrics-0\" (UID: \"6270d3ec-66cb-42c0-97d5-a83f4fe7c854\") " pod="openstack/kube-state-metrics-0" Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.782396 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerStarted","Data":"d9e349f7b742a7281d6b3dac55b1fcbb11bf69f22d50fa610428485307bf03bd"} Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.786233 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerStarted","Data":"6ab52c034f0b6d1519cfa788b0a1bb05d7062840b05df53c1b5a2a93fe212dea"} Sep 30 17:59:43 crc kubenswrapper[4797]: I0930 17:59:43.816346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.873640 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.878800 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.886223 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-n6tgk" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.886507 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.886640 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.888681 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.889185 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.889301 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.896094 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 17:59:44 
crc kubenswrapper[4797]: I0930 17:59:44.942462 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942524 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942582 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942616 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6kp\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942666 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942711 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:44 crc kubenswrapper[4797]: I0930 17:59:44.942738 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.043994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044062 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044131 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044198 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044255 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044287 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6kp\" (UniqueName: 
\"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.044334 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.045268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.050469 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.050684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.051559 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.051586 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dc90504e8fd72bffa4b364aa4d9dd59b6e7cea028ec589d2165a3270de0ac3cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.051616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.053140 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.058684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.082310 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6kp\" (UniqueName: 
\"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.092592 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:45 crc kubenswrapper[4797]: I0930 17:59:45.218774 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.724864 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nf4pk"] Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.726178 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.729720 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.729927 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-l6l8w" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.730088 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.745235 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nf4pk"] Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.766221 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dqkv4"] Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.770385 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.778281 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dqkv4"] Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802213 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-run\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802308 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802331 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a527992-92f7-4aab-b8d4-e75ec72fd684-scripts\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802353 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-ovn-controller-tls-certs\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802375 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ksp8\" (UniqueName: 
\"kubernetes.io/projected/4a527992-92f7-4aab-b8d4-e75ec72fd684-kube-api-access-5ksp8\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802390 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-combined-ca-bundle\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802407 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcq74\" (UniqueName: \"kubernetes.io/projected/2671b936-5121-4120-b39c-9686d92ed101-kube-api-access-zcq74\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802537 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-log\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-etc-ovs\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802752 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2671b936-5121-4120-b39c-9686d92ed101-scripts\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802770 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-log-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.802793 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-lib\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-run\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run\") pod \"ovn-controller-nf4pk\" 
(UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904882 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a527992-92f7-4aab-b8d4-e75ec72fd684-scripts\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904914 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-ovn-controller-tls-certs\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904944 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ksp8\" (UniqueName: \"kubernetes.io/projected/4a527992-92f7-4aab-b8d4-e75ec72fd684-kube-api-access-5ksp8\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904969 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-combined-ca-bundle\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.904993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc 
kubenswrapper[4797]: I0930 17:59:47.905017 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcq74\" (UniqueName: \"kubernetes.io/projected/2671b936-5121-4120-b39c-9686d92ed101-kube-api-access-zcq74\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905050 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-log\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905122 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-etc-ovs\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2671b936-5121-4120-b39c-9686d92ed101-scripts\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905176 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-log-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905200 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-lib\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905270 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-run\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905588 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-lib\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905615 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-etc-ovs\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-log-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" 
Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.905873 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a527992-92f7-4aab-b8d4-e75ec72fd684-var-run-ovn\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.906233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2671b936-5121-4120-b39c-9686d92ed101-var-log\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.907518 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2671b936-5121-4120-b39c-9686d92ed101-scripts\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.907883 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a527992-92f7-4aab-b8d4-e75ec72fd684-scripts\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.910069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-combined-ca-bundle\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.918039 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4a527992-92f7-4aab-b8d4-e75ec72fd684-ovn-controller-tls-certs\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.925210 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ksp8\" (UniqueName: \"kubernetes.io/projected/4a527992-92f7-4aab-b8d4-e75ec72fd684-kube-api-access-5ksp8\") pod \"ovn-controller-nf4pk\" (UID: \"4a527992-92f7-4aab-b8d4-e75ec72fd684\") " pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:47 crc kubenswrapper[4797]: I0930 17:59:47.928072 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcq74\" (UniqueName: \"kubernetes.io/projected/2671b936-5121-4120-b39c-9686d92ed101-kube-api-access-zcq74\") pod \"ovn-controller-ovs-dqkv4\" (UID: \"2671b936-5121-4120-b39c-9686d92ed101\") " pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:48 crc kubenswrapper[4797]: I0930 17:59:48.042188 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nf4pk" Sep 30 17:59:48 crc kubenswrapper[4797]: I0930 17:59:48.092216 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 17:59:48 crc kubenswrapper[4797]: I0930 17:59:48.762112 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.350562 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.352365 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.355093 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.355325 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.355418 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.355582 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xb9fl" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.355797 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.389831 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430299 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430441 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430514 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430700 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjzmq\" (UniqueName: \"kubernetes.io/projected/5df911ba-9b38-46e5-b779-3db695c839a9-kube-api-access-wjzmq\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430758 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430861 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.430897 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533142 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533309 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjzmq\" (UniqueName: \"kubernetes.io/projected/5df911ba-9b38-46e5-b779-3db695c839a9-kube-api-access-wjzmq\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533334 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 
17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533400 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533487 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.533613 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.534352 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.535016 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5df911ba-9b38-46e5-b779-3db695c839a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.535521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.540464 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.541369 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.548886 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df911ba-9b38-46e5-b779-3db695c839a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.555075 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjzmq\" (UniqueName: \"kubernetes.io/projected/5df911ba-9b38-46e5-b779-3db695c839a9-kube-api-access-wjzmq\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.565800 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df911ba-9b38-46e5-b779-3db695c839a9\") " pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:49 crc kubenswrapper[4797]: I0930 17:59:49.680605 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.010204 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.011566 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.013393 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.013658 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.013665 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.015083 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-prc45" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.024520 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079317 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4hb\" (UniqueName: 
\"kubernetes.io/projected/a3e42915-c1cf-479d-8cb1-d337a4407d64-kube-api-access-wk4hb\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079573 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079614 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079651 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079776 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079886 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.079975 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.080123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.182888 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.182954 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4hb\" (UniqueName: \"kubernetes.io/projected/a3e42915-c1cf-479d-8cb1-d337a4407d64-kube-api-access-wk4hb\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.182997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " 
pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183017 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183056 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183082 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183173 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.183802 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.184303 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.185069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e42915-c1cf-479d-8cb1-d337a4407d64-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.188179 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.190081 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " 
pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.192965 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3e42915-c1cf-479d-8cb1-d337a4407d64-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.200580 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4hb\" (UniqueName: \"kubernetes.io/projected/a3e42915-c1cf-479d-8cb1-d337a4407d64-kube-api-access-wk4hb\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.203023 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3e42915-c1cf-479d-8cb1-d337a4407d64\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:51 crc kubenswrapper[4797]: I0930 17:59:51.336164 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 17:59:53 crc kubenswrapper[4797]: W0930 17:59:53.914488 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b64d922_5a17_4831_b87e_78ae0a9a9042.slice/crio-2dac7014833492f6a750f451d4e4276bb019ebcc071c178954abed217b53c520 WatchSource:0}: Error finding container 2dac7014833492f6a750f451d4e4276bb019ebcc071c178954abed217b53c520: Status 404 returned error can't find the container with id 2dac7014833492f6a750f451d4e4276bb019ebcc071c178954abed217b53c520 Sep 30 17:59:54 crc kubenswrapper[4797]: I0930 17:59:54.422819 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 17:59:54 crc kubenswrapper[4797]: I0930 17:59:54.874065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b64d922-5a17-4831-b87e-78ae0a9a9042","Type":"ContainerStarted","Data":"2dac7014833492f6a750f451d4e4276bb019ebcc071c178954abed217b53c520"} Sep 30 17:59:55 crc kubenswrapper[4797]: W0930 17:59:55.331360 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e728b8d_50b8_43df_bb8c_e3cbfce614e9.slice/crio-836fd3475bca4537eb73cacbc47f6be20e4241efd9c05ba298a2445b23f57df1 WatchSource:0}: Error finding container 836fd3475bca4537eb73cacbc47f6be20e4241efd9c05ba298a2445b23f57df1: Status 404 returned error can't find the container with id 836fd3475bca4537eb73cacbc47f6be20e4241efd9c05ba298a2445b23f57df1 Sep 30 17:59:55 crc kubenswrapper[4797]: I0930 17:59:55.881932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6e728b8d-50b8-43df-bb8c-e3cbfce614e9","Type":"ContainerStarted","Data":"836fd3475bca4537eb73cacbc47f6be20e4241efd9c05ba298a2445b23f57df1"} Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.222108 4797 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.222346 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gv9pd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFile
system:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jk6wp_openstack(8e53c2c0-a774-4a3e-accc-40524c99ea82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.223489 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" podUID="8e53c2c0-a774-4a3e-accc-40524c99ea82" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.293797 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.293954 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44qfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-647hm_openstack(564b4715-8914-4183-b843-4d87fb41145a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.295456 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" podUID="564b4715-8914-4183-b843-4d87fb41145a" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.307458 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.307612 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dtxdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-qcpsp_openstack(6c5a18c6-a06e-4ef7-aa5a-078979990eee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:59:56 crc kubenswrapper[4797]: E0930 17:59:56.309014 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.750458 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.771520 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 17:59:56 crc kubenswrapper[4797]: W0930 17:59:56.776541 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc386be5e_6533_42b6_8a82_512c4c60cab2.slice/crio-90e16e97e69fe6079e98c08e66dac1a48a612a36d38b8f407775488a6049b1a0 WatchSource:0}: Error finding container 90e16e97e69fe6079e98c08e66dac1a48a612a36d38b8f407775488a6049b1a0: Status 404 returned error can't find the container with id 90e16e97e69fe6079e98c08e66dac1a48a612a36d38b8f407775488a6049b1a0 Sep 30 
17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.778607 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nf4pk"] Sep 30 17:59:56 crc kubenswrapper[4797]: W0930 17:59:56.781211 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a527992_92f7_4aab_b8d4_e75ec72fd684.slice/crio-4c5eaaf3c5f219e674ce661d32987cd1d178404d7f04857c4cd7e4e1197f41c8 WatchSource:0}: Error finding container 4c5eaaf3c5f219e674ce661d32987cd1d178404d7f04857c4cd7e4e1197f41c8: Status 404 returned error can't find the container with id 4c5eaaf3c5f219e674ce661d32987cd1d178404d7f04857c4cd7e4e1197f41c8 Sep 30 17:59:56 crc kubenswrapper[4797]: W0930 17:59:56.782932 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433bd6ef_bdef_4c2b_9fb3_019fecae8b40.slice/crio-f675b79a7a7293f6cfd5c58b833e7ecebd856f9a61fdcbb5d20f75088205b2bf WatchSource:0}: Error finding container f675b79a7a7293f6cfd5c58b833e7ecebd856f9a61fdcbb5d20f75088205b2bf: Status 404 returned error can't find the container with id f675b79a7a7293f6cfd5c58b833e7ecebd856f9a61fdcbb5d20f75088205b2bf Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.892527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerStarted","Data":"f675b79a7a7293f6cfd5c58b833e7ecebd856f9a61fdcbb5d20f75088205b2bf"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.895075 4797 generic.go:334] "Generic (PLEG): container finished" podID="9e788f55-44b8-4d71-b419-023ad236d45c" containerID="3a0cebdf71004c535d1e53fbec5c1a63de2092a9bc3cb6a3390da928948412e8" exitCode=0 Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.895331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" 
event={"ID":"9e788f55-44b8-4d71-b419-023ad236d45c","Type":"ContainerDied","Data":"3a0cebdf71004c535d1e53fbec5c1a63de2092a9bc3cb6a3390da928948412e8"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.897562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c386be5e-6533-42b6-8a82-512c4c60cab2","Type":"ContainerStarted","Data":"90e16e97e69fe6079e98c08e66dac1a48a612a36d38b8f407775488a6049b1a0"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.899494 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerStarted","Data":"32d899fb0b4dcce53cc91d0ae0c0891dd0383d1706912339fb8c5a31d211f1b7"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.901256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerStarted","Data":"ddea9aed82f65ba1df21799ed2ffbae0368261d2ae6cbde18121dc443bad437c"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.908904 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk" event={"ID":"4a527992-92f7-4aab-b8d4-e75ec72fd684","Type":"ContainerStarted","Data":"4c5eaaf3c5f219e674ce661d32987cd1d178404d7f04857c4cd7e4e1197f41c8"} Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.914960 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dqkv4"] Sep 30 17:59:56 crc kubenswrapper[4797]: I0930 17:59:56.962609 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 17:59:57 crc kubenswrapper[4797]: I0930 17:59:57.039482 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 17:59:58 crc kubenswrapper[4797]: I0930 17:59:58.068956 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 
30 17:59:58 crc kubenswrapper[4797]: W0930 17:59:58.707828 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3e42915_c1cf_479d_8cb1_d337a4407d64.slice/crio-5b640034432e06589266d87a6a6e8f3420dbaab8cc59a9079a1020c92d528721 WatchSource:0}: Error finding container 5b640034432e06589266d87a6a6e8f3420dbaab8cc59a9079a1020c92d528721: Status 404 returned error can't find the container with id 5b640034432e06589266d87a6a6e8f3420dbaab8cc59a9079a1020c92d528721 Sep 30 17:59:58 crc kubenswrapper[4797]: I0930 17:59:58.927963 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3e42915-c1cf-479d-8cb1-d337a4407d64","Type":"ContainerStarted","Data":"5b640034432e06589266d87a6a6e8f3420dbaab8cc59a9079a1020c92d528721"} Sep 30 17:59:59 crc kubenswrapper[4797]: W0930 17:59:59.350765 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2671b936_5121_4120_b39c_9686d92ed101.slice/crio-c68d069f1b76c17b476cfaead8627531cb447281cd0955682d22ee5c17cd4a41 WatchSource:0}: Error finding container c68d069f1b76c17b476cfaead8627531cb447281cd0955682d22ee5c17cd4a41: Status 404 returned error can't find the container with id c68d069f1b76c17b476cfaead8627531cb447281cd0955682d22ee5c17cd4a41 Sep 30 17:59:59 crc kubenswrapper[4797]: W0930 17:59:59.356969 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6270d3ec_66cb_42c0_97d5_a83f4fe7c854.slice/crio-ab53a8a6e20e8b1a477125c63163c59d13702b6df47a8c825c3e8c83349cca29 WatchSource:0}: Error finding container ab53a8a6e20e8b1a477125c63163c59d13702b6df47a8c825c3e8c83349cca29: Status 404 returned error can't find the container with id ab53a8a6e20e8b1a477125c63163c59d13702b6df47a8c825c3e8c83349cca29 Sep 30 17:59:59 crc kubenswrapper[4797]: W0930 17:59:59.361006 4797 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df911ba_9b38_46e5_b779_3db695c839a9.slice/crio-b78f57a900642b2750093caa08e4eb7b9746d783d36abafa9723721644176c4e WatchSource:0}: Error finding container b78f57a900642b2750093caa08e4eb7b9746d783d36abafa9723721644176c4e: Status 404 returned error can't find the container with id b78f57a900642b2750093caa08e4eb7b9746d783d36abafa9723721644176c4e Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.483680 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.513062 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.534420 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qfl\" (UniqueName: \"kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl\") pod \"564b4715-8914-4183-b843-4d87fb41145a\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.534492 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config\") pod \"564b4715-8914-4183-b843-4d87fb41145a\" (UID: \"564b4715-8914-4183-b843-4d87fb41145a\") " Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.534570 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv9pd\" (UniqueName: \"kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd\") pod \"8e53c2c0-a774-4a3e-accc-40524c99ea82\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.534598 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc\") pod \"8e53c2c0-a774-4a3e-accc-40524c99ea82\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.534633 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config\") pod \"8e53c2c0-a774-4a3e-accc-40524c99ea82\" (UID: \"8e53c2c0-a774-4a3e-accc-40524c99ea82\") " Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.535362 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config" (OuterVolumeSpecName: "config") pod "8e53c2c0-a774-4a3e-accc-40524c99ea82" (UID: "8e53c2c0-a774-4a3e-accc-40524c99ea82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.537303 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e53c2c0-a774-4a3e-accc-40524c99ea82" (UID: "8e53c2c0-a774-4a3e-accc-40524c99ea82"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.537802 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config" (OuterVolumeSpecName: "config") pod "564b4715-8914-4183-b843-4d87fb41145a" (UID: "564b4715-8914-4183-b843-4d87fb41145a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.538844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl" (OuterVolumeSpecName: "kube-api-access-44qfl") pod "564b4715-8914-4183-b843-4d87fb41145a" (UID: "564b4715-8914-4183-b843-4d87fb41145a"). InnerVolumeSpecName "kube-api-access-44qfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.541298 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd" (OuterVolumeSpecName: "kube-api-access-gv9pd") pod "8e53c2c0-a774-4a3e-accc-40524c99ea82" (UID: "8e53c2c0-a774-4a3e-accc-40524c99ea82"). InnerVolumeSpecName "kube-api-access-gv9pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.636776 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.636819 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e53c2c0-a774-4a3e-accc-40524c99ea82-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.636830 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qfl\" (UniqueName: \"kubernetes.io/projected/564b4715-8914-4183-b843-4d87fb41145a-kube-api-access-44qfl\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.636842 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564b4715-8914-4183-b843-4d87fb41145a-config\") on node 
\"crc\" DevicePath \"\"" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.636851 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv9pd\" (UniqueName: \"kubernetes.io/projected/8e53c2c0-a774-4a3e-accc-40524c99ea82-kube-api-access-gv9pd\") on node \"crc\" DevicePath \"\"" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.938324 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c386be5e-6533-42b6-8a82-512c4c60cab2","Type":"ContainerStarted","Data":"54363b05d5fc8e7e426e92a85c019045dd80f65a9a74a1e2cb5d11be917a198b"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.939529 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df911ba-9b38-46e5-b779-3db695c839a9","Type":"ContainerStarted","Data":"b78f57a900642b2750093caa08e4eb7b9746d783d36abafa9723721644176c4e"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.941160 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerID="56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c" exitCode=0 Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.941211 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" event={"ID":"6c5a18c6-a06e-4ef7-aa5a-078979990eee","Type":"ContainerDied","Data":"56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.942960 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6e728b8d-50b8-43df-bb8c-e3cbfce614e9","Type":"ContainerStarted","Data":"ca1e7cd4b0025bc1683b3f9254ef58680510cc333a1e11d6798844b6161024a5"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.943063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 
17:59:59.944679 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dqkv4" event={"ID":"2671b936-5121-4120-b39c-9686d92ed101","Type":"ContainerStarted","Data":"c68d069f1b76c17b476cfaead8627531cb447281cd0955682d22ee5c17cd4a41"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.946377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" event={"ID":"8e53c2c0-a774-4a3e-accc-40524c99ea82","Type":"ContainerDied","Data":"ff9469db42bc100c9eb0304cf476218f5f75e683dc310d44122455f788312c8f"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.946403 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jk6wp" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.948125 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b64d922-5a17-4831-b87e-78ae0a9a9042","Type":"ContainerStarted","Data":"ed0a9f6f6ec12f1b78304afd90b64eaee90940981557081c7ef7d4fc4f2643a8"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.949946 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6270d3ec-66cb-42c0-97d5-a83f4fe7c854","Type":"ContainerStarted","Data":"ab53a8a6e20e8b1a477125c63163c59d13702b6df47a8c825c3e8c83349cca29"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.950750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" event={"ID":"564b4715-8914-4183-b843-4d87fb41145a","Type":"ContainerDied","Data":"b66cee8580afb9097459e7c8fb7e234a0332499efa898eb103fb64093d477d67"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.950801 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-647hm" Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.952259 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" event={"ID":"9e788f55-44b8-4d71-b419-023ad236d45c","Type":"ContainerStarted","Data":"97b5b330253b6b7cc103ae85db3cac071b3ba1d86f1bb1405b4908eb9e697091"} Sep 30 17:59:59 crc kubenswrapper[4797]: I0930 17:59:59.952708 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.025540 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.850300978 podStartE2EDuration="19.025519633s" podCreationTimestamp="2025-09-30 17:59:41 +0000 UTC" firstStartedPulling="2025-09-30 17:59:55.334991482 +0000 UTC m=+1045.857490760" lastFinishedPulling="2025-09-30 17:59:59.510210177 +0000 UTC m=+1050.032709415" observedRunningTime="2025-09-30 18:00:00.019645433 +0000 UTC m=+1050.542144671" watchObservedRunningTime="2025-09-30 18:00:00.025519633 +0000 UTC m=+1050.548018871" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.039283 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" podStartSLOduration=4.751967237 podStartE2EDuration="23.039262448s" podCreationTimestamp="2025-09-30 17:59:37 +0000 UTC" firstStartedPulling="2025-09-30 17:59:38.0657347 +0000 UTC m=+1028.588233938" lastFinishedPulling="2025-09-30 17:59:56.353029911 +0000 UTC m=+1046.875529149" observedRunningTime="2025-09-30 18:00:00.038330812 +0000 UTC m=+1050.560830050" watchObservedRunningTime="2025-09-30 18:00:00.039262448 +0000 UTC m=+1050.561761686" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.080967 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 18:00:00 crc 
kubenswrapper[4797]: I0930 18:00:00.081016 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-647hm"] Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.116752 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.128766 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jk6wp"] Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.158096 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6"] Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.159288 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.161670 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.161862 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.174308 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6"] Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.258782 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.258907 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwcp\" (UniqueName: \"kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.258984 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.265629 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564b4715-8914-4183-b843-4d87fb41145a" path="/var/lib/kubelet/pods/564b4715-8914-4183-b843-4d87fb41145a/volumes" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.265969 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e53c2c0-a774-4a3e-accc-40524c99ea82" path="/var/lib/kubelet/pods/8e53c2c0-a774-4a3e-accc-40524c99ea82/volumes" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.361718 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwcp\" (UniqueName: \"kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.361868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.361936 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.366718 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.370916 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.383370 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwcp\" (UniqueName: \"kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp\") pod \"collect-profiles-29320920-2wfk6\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.484721 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.961314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" event={"ID":"6c5a18c6-a06e-4ef7-aa5a-078979990eee","Type":"ContainerStarted","Data":"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f"} Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.962633 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 18:00:00 crc kubenswrapper[4797]: I0930 18:00:00.981545 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" podStartSLOduration=-9223372011.873245 podStartE2EDuration="24.981530108s" podCreationTimestamp="2025-09-30 17:59:36 +0000 UTC" firstStartedPulling="2025-09-30 17:59:37.76350621 +0000 UTC m=+1028.286005448" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:00.978278149 +0000 UTC m=+1051.500777397" watchObservedRunningTime="2025-09-30 18:00:00.981530108 +0000 UTC m=+1051.504029346" Sep 30 18:00:01 crc kubenswrapper[4797]: I0930 18:00:01.135126 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6"] Sep 30 18:00:02 crc kubenswrapper[4797]: W0930 18:00:02.384659 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b99a78e_2107_425b_8f49_3ac3621ba170.slice/crio-4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029 WatchSource:0}: Error finding container 4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029: Status 404 returned error can't find the container with id 4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029 Sep 30 18:00:02 crc 
kubenswrapper[4797]: I0930 18:00:02.977056 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" event={"ID":"2b99a78e-2107-425b-8f49-3ac3621ba170","Type":"ContainerStarted","Data":"4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029"} Sep 30 18:00:03 crc kubenswrapper[4797]: I0930 18:00:03.996913 4797 generic.go:334] "Generic (PLEG): container finished" podID="c386be5e-6533-42b6-8a82-512c4c60cab2" containerID="54363b05d5fc8e7e426e92a85c019045dd80f65a9a74a1e2cb5d11be917a198b" exitCode=0 Sep 30 18:00:03 crc kubenswrapper[4797]: I0930 18:00:03.997247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c386be5e-6533-42b6-8a82-512c4c60cab2","Type":"ContainerDied","Data":"54363b05d5fc8e7e426e92a85c019045dd80f65a9a74a1e2cb5d11be917a198b"} Sep 30 18:00:04 crc kubenswrapper[4797]: I0930 18:00:04.002879 4797 generic.go:334] "Generic (PLEG): container finished" podID="9b64d922-5a17-4831-b87e-78ae0a9a9042" containerID="ed0a9f6f6ec12f1b78304afd90b64eaee90940981557081c7ef7d4fc4f2643a8" exitCode=0 Sep 30 18:00:04 crc kubenswrapper[4797]: I0930 18:00:04.003065 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b64d922-5a17-4831-b87e-78ae0a9a9042","Type":"ContainerDied","Data":"ed0a9f6f6ec12f1b78304afd90b64eaee90940981557081c7ef7d4fc4f2643a8"} Sep 30 18:00:04 crc kubenswrapper[4797]: I0930 18:00:04.007198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" event={"ID":"2b99a78e-2107-425b-8f49-3ac3621ba170","Type":"ContainerStarted","Data":"cdad9f8cbad5d86fb53aa2fac76142f7966dffdbf5539e0521dc55c0a848474d"} Sep 30 18:00:04 crc kubenswrapper[4797]: I0930 18:00:04.043279 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" podStartSLOduration=4.043259639 podStartE2EDuration="4.043259639s" podCreationTimestamp="2025-09-30 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:04.039238559 +0000 UTC m=+1054.561737807" watchObservedRunningTime="2025-09-30 18:00:04.043259639 +0000 UTC m=+1054.565758897" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.016791 4797 generic.go:334] "Generic (PLEG): container finished" podID="2b99a78e-2107-425b-8f49-3ac3621ba170" containerID="cdad9f8cbad5d86fb53aa2fac76142f7966dffdbf5539e0521dc55c0a848474d" exitCode=0 Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.016899 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" event={"ID":"2b99a78e-2107-425b-8f49-3ac3621ba170","Type":"ContainerDied","Data":"cdad9f8cbad5d86fb53aa2fac76142f7966dffdbf5539e0521dc55c0a848474d"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.019575 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c386be5e-6533-42b6-8a82-512c4c60cab2","Type":"ContainerStarted","Data":"c9602edff86fcbe842372ea67d45ce100c02f9f17782383d340887db16798290"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.021475 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dqkv4" event={"ID":"2671b936-5121-4120-b39c-9686d92ed101","Type":"ContainerStarted","Data":"520af4d8af504cb77099c6d56d52e1ffaa2d61a40a955b7d9a4d8829a615f08d"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.024817 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b64d922-5a17-4831-b87e-78ae0a9a9042","Type":"ContainerStarted","Data":"aa2cc5555fc212473b1d2a10134cf96737a2687e042e3824e5ffef39ebdb6440"} Sep 30 
18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.026276 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6270d3ec-66cb-42c0-97d5-a83f4fe7c854","Type":"ContainerStarted","Data":"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.026390 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.027872 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3e42915-c1cf-479d-8cb1-d337a4407d64","Type":"ContainerStarted","Data":"e1ac90b0c05f7a082e8b86f0b955fafc31f68971f143e5756e291e3edb76af4b"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.029313 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df911ba-9b38-46e5-b779-3db695c839a9","Type":"ContainerStarted","Data":"1d3d28c99aa769ee89c5080ba6f66b8501c36034805a18a5400bb5e7f6d42cc8"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.033190 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk" event={"ID":"4a527992-92f7-4aab-b8d4-e75ec72fd684","Type":"ContainerStarted","Data":"8e479bbc25b373b574cf8fd8cb30c85e3d0f00bebcea4536eb7137dfb20166a3"} Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.033826 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nf4pk" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.051013 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.897636482 podStartE2EDuration="22.050996176s" podCreationTimestamp="2025-09-30 17:59:43 +0000 UTC" firstStartedPulling="2025-09-30 17:59:59.359653398 +0000 UTC m=+1049.882152636" lastFinishedPulling="2025-09-30 18:00:04.513013092 +0000 UTC 
m=+1055.035512330" observedRunningTime="2025-09-30 18:00:05.044121148 +0000 UTC m=+1055.566620386" watchObservedRunningTime="2025-09-30 18:00:05.050996176 +0000 UTC m=+1055.573495414" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.095914 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.50224048 podStartE2EDuration="25.095898422s" podCreationTimestamp="2025-09-30 17:59:40 +0000 UTC" firstStartedPulling="2025-09-30 17:59:53.916861464 +0000 UTC m=+1044.439360712" lastFinishedPulling="2025-09-30 17:59:59.510519426 +0000 UTC m=+1050.033018654" observedRunningTime="2025-09-30 18:00:05.094278807 +0000 UTC m=+1055.616778045" watchObservedRunningTime="2025-09-30 18:00:05.095898422 +0000 UTC m=+1055.618397660" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.124243 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.391308809 podStartE2EDuration="25.124219694s" podCreationTimestamp="2025-09-30 17:59:40 +0000 UTC" firstStartedPulling="2025-09-30 17:59:56.778999398 +0000 UTC m=+1047.301498636" lastFinishedPulling="2025-09-30 17:59:59.511910283 +0000 UTC m=+1050.034409521" observedRunningTime="2025-09-30 18:00:05.121535041 +0000 UTC m=+1055.644034299" watchObservedRunningTime="2025-09-30 18:00:05.124219694 +0000 UTC m=+1055.646718962" Sep 30 18:00:05 crc kubenswrapper[4797]: I0930 18:00:05.148002 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nf4pk" podStartSLOduration=11.333527128 podStartE2EDuration="18.147981533s" podCreationTimestamp="2025-09-30 17:59:47 +0000 UTC" firstStartedPulling="2025-09-30 17:59:56.7835164 +0000 UTC m=+1047.306015638" lastFinishedPulling="2025-09-30 18:00:03.597970795 +0000 UTC m=+1054.120470043" observedRunningTime="2025-09-30 18:00:05.138792242 +0000 UTC m=+1055.661291490" watchObservedRunningTime="2025-09-30 
18:00:05.147981533 +0000 UTC m=+1055.670480781" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.041123 4797 generic.go:334] "Generic (PLEG): container finished" podID="2671b936-5121-4120-b39c-9686d92ed101" containerID="520af4d8af504cb77099c6d56d52e1ffaa2d61a40a955b7d9a4d8829a615f08d" exitCode=0 Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.041193 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dqkv4" event={"ID":"2671b936-5121-4120-b39c-9686d92ed101","Type":"ContainerDied","Data":"520af4d8af504cb77099c6d56d52e1ffaa2d61a40a955b7d9a4d8829a615f08d"} Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.365655 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.394151 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwcp\" (UniqueName: \"kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp\") pod \"2b99a78e-2107-425b-8f49-3ac3621ba170\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.394280 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume\") pod \"2b99a78e-2107-425b-8f49-3ac3621ba170\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.394357 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume\") pod \"2b99a78e-2107-425b-8f49-3ac3621ba170\" (UID: \"2b99a78e-2107-425b-8f49-3ac3621ba170\") " Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.395033 4797 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b99a78e-2107-425b-8f49-3ac3621ba170" (UID: "2b99a78e-2107-425b-8f49-3ac3621ba170"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.399558 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp" (OuterVolumeSpecName: "kube-api-access-2bwcp") pod "2b99a78e-2107-425b-8f49-3ac3621ba170" (UID: "2b99a78e-2107-425b-8f49-3ac3621ba170"). InnerVolumeSpecName "kube-api-access-2bwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.399710 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b99a78e-2107-425b-8f49-3ac3621ba170" (UID: "2b99a78e-2107-425b-8f49-3ac3621ba170"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.495282 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwcp\" (UniqueName: \"kubernetes.io/projected/2b99a78e-2107-425b-8f49-3ac3621ba170-kube-api-access-2bwcp\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.495536 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b99a78e-2107-425b-8f49-3ac3621ba170-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.495546 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b99a78e-2107-425b-8f49-3ac3621ba170-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:06 crc kubenswrapper[4797]: I0930 18:00:06.885625 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.057545 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dqkv4" event={"ID":"2671b936-5121-4120-b39c-9686d92ed101","Type":"ContainerStarted","Data":"b0357623a16f4fe397251e0c9d3655aaf4b73969db1fcf9dc04e21831562821c"} Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.057587 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dqkv4" event={"ID":"2671b936-5121-4120-b39c-9686d92ed101","Type":"ContainerStarted","Data":"2770659c56973d4a135bf681a4fc54e12f87ef25b63b1f221760d4e9386da2d7"} Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.057698 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.061040 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.061033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6" event={"ID":"2b99a78e-2107-425b-8f49-3ac3621ba170","Type":"ContainerDied","Data":"4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029"} Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.061084 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4357d6a2c2dae03f44564ffc84cc1d380391f8fed42d6aebba65a369e5cfb029" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.063499 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerStarted","Data":"2e37114ce23ab91d58d22daaf5fcdf98ae15ccc588feae39c9852b4c5efa4111"} Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.080138 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dqkv4" podStartSLOduration=15.92220055 podStartE2EDuration="20.080123652s" podCreationTimestamp="2025-09-30 17:59:47 +0000 UTC" firstStartedPulling="2025-09-30 17:59:59.353305385 +0000 UTC m=+1049.875804623" lastFinishedPulling="2025-09-30 18:00:03.511228487 +0000 UTC m=+1054.033727725" observedRunningTime="2025-09-30 18:00:07.075123565 +0000 UTC m=+1057.597622803" watchObservedRunningTime="2025-09-30 18:00:07.080123652 +0000 UTC m=+1057.602622880" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.249614 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 18:00:07 crc kubenswrapper[4797]: I0930 18:00:07.522587 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 18:00:07 crc 
kubenswrapper[4797]: I0930 18:00:07.573043 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.071079 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.071478 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="dnsmasq-dns" containerID="cri-o://f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f" gracePeriod=10 Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.548350 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.641639 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc\") pod \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.642619 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config\") pod \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.642759 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtxdt\" (UniqueName: \"kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt\") pod \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\" (UID: \"6c5a18c6-a06e-4ef7-aa5a-078979990eee\") " Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.677597 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt" (OuterVolumeSpecName: "kube-api-access-dtxdt") pod "6c5a18c6-a06e-4ef7-aa5a-078979990eee" (UID: "6c5a18c6-a06e-4ef7-aa5a-078979990eee"). InnerVolumeSpecName "kube-api-access-dtxdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.698104 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config" (OuterVolumeSpecName: "config") pod "6c5a18c6-a06e-4ef7-aa5a-078979990eee" (UID: "6c5a18c6-a06e-4ef7-aa5a-078979990eee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.701895 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c5a18c6-a06e-4ef7-aa5a-078979990eee" (UID: "6c5a18c6-a06e-4ef7-aa5a-078979990eee"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.745396 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.745445 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtxdt\" (UniqueName: \"kubernetes.io/projected/6c5a18c6-a06e-4ef7-aa5a-078979990eee-kube-api-access-dtxdt\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:08 crc kubenswrapper[4797]: I0930 18:00:08.745461 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5a18c6-a06e-4ef7-aa5a-078979990eee-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.079032 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerID="f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f" exitCode=0 Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.079064 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" event={"ID":"6c5a18c6-a06e-4ef7-aa5a-078979990eee","Type":"ContainerDied","Data":"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f"} Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.080407 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" event={"ID":"6c5a18c6-a06e-4ef7-aa5a-078979990eee","Type":"ContainerDied","Data":"482d0d18d44ec1b3b0d2e569aaa425d2b65826ca281dc4e59ed4b3063cc5b717"} Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.079319 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qcpsp" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.080464 4797 scope.go:117] "RemoveContainer" containerID="f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.082851 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3e42915-c1cf-479d-8cb1-d337a4407d64","Type":"ContainerStarted","Data":"9171bc199214eba9b09485a80be8165d04db6446729be034d36c2e872982d0b0"} Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.084591 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df911ba-9b38-46e5-b779-3db695c839a9","Type":"ContainerStarted","Data":"6a6ccdd40392ab9206ad0bdd7f970821846eb40a72c197fd0093e7560ba7f90c"} Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.113885 4797 scope.go:117] "RemoveContainer" containerID="56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.130891 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.257933347 podStartE2EDuration="21.130869238s" podCreationTimestamp="2025-09-30 17:59:48 +0000 UTC" firstStartedPulling="2025-09-30 17:59:59.389294417 +0000 UTC m=+1049.911793645" lastFinishedPulling="2025-09-30 18:00:08.262230298 +0000 UTC m=+1058.784729536" observedRunningTime="2025-09-30 18:00:09.101186577 +0000 UTC m=+1059.623685825" watchObservedRunningTime="2025-09-30 18:00:09.130869238 +0000 UTC m=+1059.653368476" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.139034 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.575615593 podStartE2EDuration="20.138418744s" podCreationTimestamp="2025-09-30 17:59:49 +0000 UTC" firstStartedPulling="2025-09-30 17:59:58.712883534 +0000 
UTC m=+1049.235382782" lastFinishedPulling="2025-09-30 18:00:08.275686705 +0000 UTC m=+1058.798185933" observedRunningTime="2025-09-30 18:00:09.123033364 +0000 UTC m=+1059.645532602" watchObservedRunningTime="2025-09-30 18:00:09.138418744 +0000 UTC m=+1059.660917982" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.139375 4797 scope.go:117] "RemoveContainer" containerID="f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f" Sep 30 18:00:09 crc kubenswrapper[4797]: E0930 18:00:09.140000 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f\": container with ID starting with f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f not found: ID does not exist" containerID="f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.140042 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f"} err="failed to get container status \"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f\": rpc error: code = NotFound desc = could not find container \"f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f\": container with ID starting with f04dac2bf0db38d1c6cf2ab158171c41e57d90cac5433eeb16c5ac11975dda4f not found: ID does not exist" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.140067 4797 scope.go:117] "RemoveContainer" containerID="56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c" Sep 30 18:00:09 crc kubenswrapper[4797]: E0930 18:00:09.140493 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c\": container with ID starting with 
56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c not found: ID does not exist" containerID="56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.140532 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c"} err="failed to get container status \"56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c\": rpc error: code = NotFound desc = could not find container \"56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c\": container with ID starting with 56a1c31c24c04566874a20febe891c13d90ddff4bae60f0bdd0871b03e06e57c not found: ID does not exist" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.150519 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.157838 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qcpsp"] Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.337234 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.373697 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 18:00:09 crc kubenswrapper[4797]: I0930 18:00:09.681659 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.093254 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.153098 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.248678 
4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" path="/var/lib/kubelet/pods/6c5a18c6-a06e-4ef7-aa5a-078979990eee/volumes" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.459605 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-5599j"] Sep 30 18:00:10 crc kubenswrapper[4797]: E0930 18:00:10.460184 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="init" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.460200 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="init" Sep 30 18:00:10 crc kubenswrapper[4797]: E0930 18:00:10.460226 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="dnsmasq-dns" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.460233 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="dnsmasq-dns" Sep 30 18:00:10 crc kubenswrapper[4797]: E0930 18:00:10.460242 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b99a78e-2107-425b-8f49-3ac3621ba170" containerName="collect-profiles" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.460248 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b99a78e-2107-425b-8f49-3ac3621ba170" containerName="collect-profiles" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.460409 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5a18c6-a06e-4ef7-aa5a-078979990eee" containerName="dnsmasq-dns" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.460446 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b99a78e-2107-425b-8f49-3ac3621ba170" containerName="collect-profiles" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.461259 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.472142 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.474342 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-5599j"] Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.519658 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8bw4c"] Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.520858 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.522774 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.537047 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8bw4c"] Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.576075 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5w5c\" (UniqueName: \"kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.576297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.576358 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.576415 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.669542 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-5599j"] Sep 30 18:00:10 crc kubenswrapper[4797]: E0930 18:00:10.670227 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-l5w5c ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-5599j" podUID="a69c5e33-b00b-497f-bf7d-3b59593b3aca" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678284 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678332 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678359 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828f5c5e-04c9-49c0-8056-7c930e756a44-config\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678377 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678633 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkjs\" (UniqueName: \"kubernetes.io/projected/828f5c5e-04c9-49c0-8056-7c930e756a44-kube-api-access-7mkjs\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678722 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5w5c\" (UniqueName: \"kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: 
\"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678871 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-combined-ca-bundle\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovn-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.678964 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovs-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.679202 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.679251 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.679317 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.680763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.691109 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.692582 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.701394 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.707202 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5w5c\" (UniqueName: \"kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c\") pod \"dnsmasq-dns-7f896c8c65-5599j\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.718069 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.740549 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780106 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780155 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780189 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780211 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780230 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8fg\" (UniqueName: \"kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780278 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-combined-ca-bundle\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovn-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780326 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovs-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780366 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828f5c5e-04c9-49c0-8056-7c930e756a44-config\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780386 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780448 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkjs\" (UniqueName: 
\"kubernetes.io/projected/828f5c5e-04c9-49c0-8056-7c930e756a44-kube-api-access-7mkjs\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780631 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovn-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.780976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/828f5c5e-04c9-49c0-8056-7c930e756a44-ovs-rundir\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.784130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-combined-ca-bundle\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.785254 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828f5c5e-04c9-49c0-8056-7c930e756a44-config\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.785756 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/828f5c5e-04c9-49c0-8056-7c930e756a44-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.795348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkjs\" (UniqueName: \"kubernetes.io/projected/828f5c5e-04c9-49c0-8056-7c930e756a44-kube-api-access-7mkjs\") pod \"ovn-controller-metrics-8bw4c\" (UID: \"828f5c5e-04c9-49c0-8056-7c930e756a44\") " pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.834421 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8bw4c" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.883970 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.884022 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.884046 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.884064 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.884080 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8fg\" (UniqueName: \"kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.885095 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.885198 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.885674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.886002 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:10 crc kubenswrapper[4797]: I0930 18:00:10.902205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8fg\" (UniqueName: \"kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg\") pod \"dnsmasq-dns-86db49b7ff-72t8v\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.079791 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.100014 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.111891 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.168165 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.191418 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb\") pod \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.191562 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config\") pod \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.191618 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc\") pod \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.191653 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5w5c\" (UniqueName: \"kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c\") pod \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\" (UID: \"a69c5e33-b00b-497f-bf7d-3b59593b3aca\") " Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.192281 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a69c5e33-b00b-497f-bf7d-3b59593b3aca" (UID: "a69c5e33-b00b-497f-bf7d-3b59593b3aca"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.193112 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config" (OuterVolumeSpecName: "config") pod "a69c5e33-b00b-497f-bf7d-3b59593b3aca" (UID: "a69c5e33-b00b-497f-bf7d-3b59593b3aca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.193395 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a69c5e33-b00b-497f-bf7d-3b59593b3aca" (UID: "a69c5e33-b00b-497f-bf7d-3b59593b3aca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.200053 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c" (OuterVolumeSpecName: "kube-api-access-l5w5c") pod "a69c5e33-b00b-497f-bf7d-3b59593b3aca" (UID: "a69c5e33-b00b-497f-bf7d-3b59593b3aca"). InnerVolumeSpecName "kube-api-access-l5w5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.272590 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8bw4c"] Sep 30 18:00:11 crc kubenswrapper[4797]: W0930 18:00:11.280501 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod828f5c5e_04c9_49c0_8056_7c930e756a44.slice/crio-5f6ad83d8b57299f9f2d4023bd75a95185be8746008fafc921130d5b542572e2 WatchSource:0}: Error finding container 5f6ad83d8b57299f9f2d4023bd75a95185be8746008fafc921130d5b542572e2: Status 404 returned error can't find the container with id 5f6ad83d8b57299f9f2d4023bd75a95185be8746008fafc921130d5b542572e2 Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.294680 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.294722 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.294738 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69c5e33-b00b-497f-bf7d-3b59593b3aca-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.294751 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5w5c\" (UniqueName: \"kubernetes.io/projected/a69c5e33-b00b-497f-bf7d-3b59593b3aca-kube-api-access-l5w5c\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.416100 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 18:00:11 crc 
kubenswrapper[4797]: I0930 18:00:11.421908 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.425930 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.426747 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.427021 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.427141 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.427308 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-458z8" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.482338 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.482393 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497764 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497812 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497854 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497903 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-scripts\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497917 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxhd9\" (UniqueName: \"kubernetes.io/projected/3139ddf0-f590-40e2-bd15-0af615d5cbf1-kube-api-access-kxhd9\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.497938 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-config\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc 
kubenswrapper[4797]: I0930 18:00:11.542282 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.543118 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: W0930 18:00:11.548456 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b471ec_3bb7_4cee_9ab1_8ea7a2d0619b.slice/crio-488896e54cc52d94a46d765a7e10f4a3f79ffd3a5e390f1ca7100d7b628fecb0 WatchSource:0}: Error finding container 488896e54cc52d94a46d765a7e10f4a3f79ffd3a5e390f1ca7100d7b628fecb0: Status 404 returned error can't find the container with id 488896e54cc52d94a46d765a7e10f4a3f79ffd3a5e390f1ca7100d7b628fecb0 Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.599691 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606405 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606583 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-scripts\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606611 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxhd9\" (UniqueName: \"kubernetes.io/projected/3139ddf0-f590-40e2-bd15-0af615d5cbf1-kube-api-access-kxhd9\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606668 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-config\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.606774 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.607305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.607660 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-scripts\") pod \"ovn-northd-0\" (UID: 
\"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.608479 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3139ddf0-f590-40e2-bd15-0af615d5cbf1-config\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.611277 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.612277 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.613668 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.613705 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.614289 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139ddf0-f590-40e2-bd15-0af615d5cbf1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.628980 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxhd9\" (UniqueName: 
\"kubernetes.io/projected/3139ddf0-f590-40e2-bd15-0af615d5cbf1-kube-api-access-kxhd9\") pod \"ovn-northd-0\" (UID: \"3139ddf0-f590-40e2-bd15-0af615d5cbf1\") " pod="openstack/ovn-northd-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.672143 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 18:00:11 crc kubenswrapper[4797]: I0930 18:00:11.770687 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.108781 4797 generic.go:334] "Generic (PLEG): container finished" podID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerID="3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b" exitCode=0 Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.108859 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" event={"ID":"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b","Type":"ContainerDied","Data":"3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b"} Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.108891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" event={"ID":"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b","Type":"ContainerStarted","Data":"488896e54cc52d94a46d765a7e10f4a3f79ffd3a5e390f1ca7100d7b628fecb0"} Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.110452 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-5599j" Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.110460 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8bw4c" event={"ID":"828f5c5e-04c9-49c0-8056-7c930e756a44","Type":"ContainerStarted","Data":"97707b7daf6b0174f43a56d08e7419bf7a20737e0ce82c7b9307497a24ef4bb6"} Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.110509 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8bw4c" event={"ID":"828f5c5e-04c9-49c0-8056-7c930e756a44","Type":"ContainerStarted","Data":"5f6ad83d8b57299f9f2d4023bd75a95185be8746008fafc921130d5b542572e2"} Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.153826 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8bw4c" podStartSLOduration=2.15381174 podStartE2EDuration="2.15381174s" podCreationTimestamp="2025-09-30 18:00:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:12.15304017 +0000 UTC m=+1062.675539408" watchObservedRunningTime="2025-09-30 18:00:12.15381174 +0000 UTC m=+1062.676310978" Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.175145 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.223922 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.278505 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.318740 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-5599j"] Sep 30 18:00:12 crc kubenswrapper[4797]: I0930 18:00:12.345789 4797 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-5599j"] Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.129025 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3139ddf0-f590-40e2-bd15-0af615d5cbf1","Type":"ContainerStarted","Data":"38ca522ff4f67535393985e2efeb6180c00f6a8290d6147535f88c8b3beee88f"} Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.131946 4797 generic.go:334] "Generic (PLEG): container finished" podID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerID="2e37114ce23ab91d58d22daaf5fcdf98ae15ccc588feae39c9852b4c5efa4111" exitCode=0 Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.132081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerDied","Data":"2e37114ce23ab91d58d22daaf5fcdf98ae15ccc588feae39c9852b4c5efa4111"} Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.141370 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" event={"ID":"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b","Type":"ContainerStarted","Data":"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691"} Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.210998 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" podStartSLOduration=3.210975496 podStartE2EDuration="3.210975496s" podCreationTimestamp="2025-09-30 18:00:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:13.189309095 +0000 UTC m=+1063.711808413" watchObservedRunningTime="2025-09-30 18:00:13.210975496 +0000 UTC m=+1063.733474744" Sep 30 18:00:13 crc kubenswrapper[4797]: I0930 18:00:13.827085 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.130087 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-dfjp2"] Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.136372 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.153499 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.166203 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dfjp2"] Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.175199 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3139ddf0-f590-40e2-bd15-0af615d5cbf1","Type":"ContainerStarted","Data":"934fa3fb7a72f678937fc2d35e27b76ce31de617a4b0b7462c21deed66613743"} Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.175418 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.203332 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.218923 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.234554 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.258444 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69c5e33-b00b-497f-bf7d-3b59593b3aca" path="/var/lib/kubelet/pods/a69c5e33-b00b-497f-bf7d-3b59593b3aca/volumes" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.266562 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9f7\" (UniqueName: \"kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7\") pod \"watcher-db-create-dfjp2\" (UID: \"efe9cbac-3168-4887-81ea-51d04c2a70c8\") " pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368239 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtp9\" (UniqueName: \"kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9\") pod 
\"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9f7\" (UniqueName: \"kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7\") pod \"watcher-db-create-dfjp2\" (UID: \"efe9cbac-3168-4887-81ea-51d04c2a70c8\") " pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368331 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.368376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.384701 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9f7\" (UniqueName: \"kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7\") pod \"watcher-db-create-dfjp2\" (UID: \"efe9cbac-3168-4887-81ea-51d04c2a70c8\") " pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.469632 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb\") pod 
\"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.469726 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.469792 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.469839 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtp9\" (UniqueName: \"kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.469873 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.470638 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: 
\"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.470852 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.470850 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.471785 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.478706 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.487359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtp9\" (UniqueName: \"kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9\") pod \"dnsmasq-dns-698758b865-4hzw4\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.546837 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:14 crc kubenswrapper[4797]: W0930 18:00:14.973860 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefe9cbac_3168_4887_81ea_51d04c2a70c8.slice/crio-767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f WatchSource:0}: Error finding container 767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f: Status 404 returned error can't find the container with id 767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f Sep 30 18:00:14 crc kubenswrapper[4797]: I0930 18:00:14.974425 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dfjp2"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.021565 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:00:15 crc kubenswrapper[4797]: W0930 18:00:15.031444 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d5a74b4_24b3_4369_bda9_4de7e98d9821.slice/crio-6e1d642a8fca3e1d5b1b2d731beb4b4795762d544ff3a635b8273f0f32a1e2bb WatchSource:0}: Error finding container 6e1d642a8fca3e1d5b1b2d731beb4b4795762d544ff3a635b8273f0f32a1e2bb: Status 404 returned error can't find the container with id 6e1d642a8fca3e1d5b1b2d731beb4b4795762d544ff3a635b8273f0f32a1e2bb Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.184781 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4hzw4" event={"ID":"4d5a74b4-24b3-4369-bda9-4de7e98d9821","Type":"ContainerStarted","Data":"6e1d642a8fca3e1d5b1b2d731beb4b4795762d544ff3a635b8273f0f32a1e2bb"} Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.186794 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dfjp2" 
event={"ID":"efe9cbac-3168-4887-81ea-51d04c2a70c8","Type":"ContainerStarted","Data":"767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f"} Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.189089 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="dnsmasq-dns" containerID="cri-o://69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691" gracePeriod=10 Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.189283 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3139ddf0-f590-40e2-bd15-0af615d5cbf1","Type":"ContainerStarted","Data":"aa35d04e215a4feaffa72eb93f4bf08b4e7953ebae95cb9f8f8edd6da4c61b34"} Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.189750 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.229387 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.8862939389999998 podStartE2EDuration="4.229369069s" podCreationTimestamp="2025-09-30 18:00:11 +0000 UTC" firstStartedPulling="2025-09-30 18:00:12.280271262 +0000 UTC m=+1062.802770490" lastFinishedPulling="2025-09-30 18:00:13.623346382 +0000 UTC m=+1064.145845620" observedRunningTime="2025-09-30 18:00:15.217809784 +0000 UTC m=+1065.740309022" watchObservedRunningTime="2025-09-30 18:00:15.229369069 +0000 UTC m=+1065.751868307" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.241290 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.287498 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.287637 4797 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.293026 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hdnbf" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.293032 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.293082 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.293262 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.387476 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-lock\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.387601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.387643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.387842 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" 
(UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-cache\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.387953 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsvl\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-kube-api-access-svsvl\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.466272 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefe9cbac_3168_4887_81ea_51d04c2a70c8.slice/crio-5555af5f93fa7e03416c149d5d3487373a712ed327c8aa9534586239983e04ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefe9cbac_3168_4887_81ea_51d04c2a70c8.slice/crio-conmon-5555af5f93fa7e03416c149d5d3487373a712ed327c8aa9534586239983e04ab.scope\": RecentStats: unable to find data in memory cache]" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.489037 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-cache\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.489116 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsvl\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-kube-api-access-svsvl\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc 
kubenswrapper[4797]: I0930 18:00:15.489159 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-lock\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.489200 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.489327 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.489339 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.489386 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:15.989366446 +0000 UTC m=+1066.511865684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.489535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.489930 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.490813 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-lock\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.491042 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fecceec-298d-4979-b468-5fe35c9b68e7-cache\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.519703 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsvl\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-kube-api-access-svsvl\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " 
pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.527462 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.600511 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.692147 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb\") pod \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.692874 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc\") pod \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.692938 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config\") pod \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.692981 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz8fg\" (UniqueName: \"kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg\") pod \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 
18:00:15.693030 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb\") pod \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\" (UID: \"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b\") " Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.704869 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg" (OuterVolumeSpecName: "kube-api-access-rz8fg") pod "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" (UID: "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b"). InnerVolumeSpecName "kube-api-access-rz8fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.747246 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config" (OuterVolumeSpecName: "config") pod "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" (UID: "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.768411 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s8zjg"] Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.768804 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="dnsmasq-dns" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.768815 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="dnsmasq-dns" Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.768823 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="init" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.768830 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="init" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.769019 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerName="dnsmasq-dns" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.769566 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.772734 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.772889 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.774651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" (UID: "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.775619 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" (UID: "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.781690 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.786425 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" (UID: "68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.786998 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s8zjg"] Sep 30 18:00:15 crc kubenswrapper[4797]: E0930 18:00:15.787766 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-trbj4 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-trbj4 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-s8zjg" podUID="f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.794821 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.795145 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.795284 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.795371 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz8fg\" (UniqueName: \"kubernetes.io/projected/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-kube-api-access-rz8fg\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.795464 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.795110 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cx5sg"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.800066 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.811328 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cx5sg"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.830045 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s8zjg"] Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897211 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897276 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897342 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 
18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897385 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzgl\" (UniqueName: \"kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897405 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897452 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " 
pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897511 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897536 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897661 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbj4\" (UniqueName: \"kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897804 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts\") pod \"swift-ring-rebalance-cx5sg\" (UID: 
\"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:15 crc kubenswrapper[4797]: I0930 18:00:15.897959 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:15.999862 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:15.999933 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:15.999974 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000040 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " 
pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000063 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000095 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzgl\" (UniqueName: \"kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000124 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000156 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc 
kubenswrapper[4797]: I0930 18:00:16.000207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000241 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000274 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbj4\" (UniqueName: \"kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000325 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.000347 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.001109 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.001383 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.002542 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.002838 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.003148 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift\") pod 
\"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: E0930 18:00:16.003298 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 18:00:16 crc kubenswrapper[4797]: E0930 18:00:16.003319 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:16 crc kubenswrapper[4797]: E0930 18:00:16.003397 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:17.003378426 +0000 UTC m=+1067.525877674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.003904 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.005161 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.006707 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.013476 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.015850 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.016907 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbj4\" (UniqueName: \"kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.017910 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.020072 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzgl\" (UniqueName: 
\"kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl\") pod \"swift-ring-rebalance-cx5sg\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.025140 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle\") pod \"swift-ring-rebalance-s8zjg\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.130106 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.211848 4797 generic.go:334] "Generic (PLEG): container finished" podID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" containerID="69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691" exitCode=0 Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.211901 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.211940 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" event={"ID":"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b","Type":"ContainerDied","Data":"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691"} Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.211966 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-72t8v" event={"ID":"68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b","Type":"ContainerDied","Data":"488896e54cc52d94a46d765a7e10f4a3f79ffd3a5e390f1ca7100d7b628fecb0"} Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.211986 4797 scope.go:117] "RemoveContainer" containerID="69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.227474 4797 generic.go:334] "Generic (PLEG): container finished" podID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerID="43f46b1243691b0907b8cac380cab5f530b7bbfce39aa59afb43e44cfd1d3db1" exitCode=0 Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.228188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4hzw4" event={"ID":"4d5a74b4-24b3-4369-bda9-4de7e98d9821","Type":"ContainerDied","Data":"43f46b1243691b0907b8cac380cab5f530b7bbfce39aa59afb43e44cfd1d3db1"} Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.248275 4797 generic.go:334] "Generic (PLEG): container finished" podID="efe9cbac-3168-4887-81ea-51d04c2a70c8" containerID="5555af5f93fa7e03416c149d5d3487373a712ed327c8aa9534586239983e04ab" exitCode=0 Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.249587 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.274874 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.293760 4797 scope.go:117] "RemoveContainer" containerID="3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.300489 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dfjp2" event={"ID":"efe9cbac-3168-4887-81ea-51d04c2a70c8","Type":"ContainerDied","Data":"5555af5f93fa7e03416c149d5d3487373a712ed327c8aa9534586239983e04ab"} Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.300523 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.300538 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-72t8v"] Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.328302 4797 scope.go:117] "RemoveContainer" containerID="69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691" Sep 30 18:00:16 crc kubenswrapper[4797]: E0930 18:00:16.328894 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691\": container with ID starting with 69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691 not found: ID does not exist" containerID="69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.328948 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691"} err="failed to get container status \"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691\": rpc error: code = NotFound desc = could not find container \"69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691\": 
container with ID starting with 69c49f743f1b4250d8cec9b15c6c083822bf21912dd57a23f97f9296db67d691 not found: ID does not exist" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.328972 4797 scope.go:117] "RemoveContainer" containerID="3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b" Sep 30 18:00:16 crc kubenswrapper[4797]: E0930 18:00:16.329464 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b\": container with ID starting with 3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b not found: ID does not exist" containerID="3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.329481 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b"} err="failed to get container status \"3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b\": rpc error: code = NotFound desc = could not find container \"3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b\": container with ID starting with 3d6a5b81f7fa1c6fd042eea99abb5729339f07095057559b7ba1090e6777f58b not found: ID does not exist" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.406956 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407013 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf\") pod 
\"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407117 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407136 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407194 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trbj4\" (UniqueName: \"kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407268 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf\") pod \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\" (UID: \"f58edcb1-a2c4-4d8d-b004-91eafbcfab4b\") " Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.407900 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.408162 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.412706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.415330 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.415487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts" (OuterVolumeSpecName: "scripts") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.415587 4797 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.415612 4797 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.415938 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.421644 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4" (OuterVolumeSpecName: "kube-api-access-trbj4") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "kube-api-access-trbj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.429396 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" (UID: "f58edcb1-a2c4-4d8d-b004-91eafbcfab4b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.517702 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.518907 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trbj4\" (UniqueName: \"kubernetes.io/projected/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-kube-api-access-trbj4\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.518931 4797 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.518949 4797 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:16 crc kubenswrapper[4797]: I0930 18:00:16.615686 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cx5sg"] Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.026166 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4r4kf"] Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.027413 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.028264 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:17 crc kubenswrapper[4797]: E0930 18:00:17.028450 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 18:00:17 crc kubenswrapper[4797]: E0930 18:00:17.028480 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:17 crc kubenswrapper[4797]: E0930 18:00:17.028530 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:19.028513538 +0000 UTC m=+1069.551012776 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.034713 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4r4kf"] Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.130524 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99h6r\" (UniqueName: \"kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r\") pod \"glance-db-create-4r4kf\" (UID: \"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6\") " pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.232743 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99h6r\" (UniqueName: \"kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r\") pod \"glance-db-create-4r4kf\" (UID: \"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6\") " pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.262933 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cx5sg" event={"ID":"139c3278-4f30-418b-ae01-2ea9ac63ab55","Type":"ContainerStarted","Data":"07b587c7e1f98929a9c70c20036a595eaba63ec2e7e3d8cd255c244514504e70"} Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.267166 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8zjg" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.267210 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4hzw4" event={"ID":"4d5a74b4-24b3-4369-bda9-4de7e98d9821","Type":"ContainerStarted","Data":"e70e34b1e2f2abdb7afeaa78eb3df112af3e91dc117ecf5a8078b286412a83b4"} Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.267256 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.270268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99h6r\" (UniqueName: \"kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r\") pod \"glance-db-create-4r4kf\" (UID: \"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6\") " pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.304508 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podStartSLOduration=3.304490201 podStartE2EDuration="3.304490201s" podCreationTimestamp="2025-09-30 18:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:17.294174169 +0000 UTC m=+1067.816673427" watchObservedRunningTime="2025-09-30 18:00:17.304490201 +0000 UTC m=+1067.826989439" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.367230 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s8zjg"] Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.367299 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-s8zjg"] Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.374378 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.723188 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.847511 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9f7\" (UniqueName: \"kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7\") pod \"efe9cbac-3168-4887-81ea-51d04c2a70c8\" (UID: \"efe9cbac-3168-4887-81ea-51d04c2a70c8\") " Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.853334 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7" (OuterVolumeSpecName: "kube-api-access-fh9f7") pod "efe9cbac-3168-4887-81ea-51d04c2a70c8" (UID: "efe9cbac-3168-4887-81ea-51d04c2a70c8"). InnerVolumeSpecName "kube-api-access-fh9f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.875571 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4r4kf"] Sep 30 18:00:17 crc kubenswrapper[4797]: W0930 18:00:17.880888 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3b6e16_37d0_4b8e_96da_1a4c21c30af6.slice/crio-0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d WatchSource:0}: Error finding container 0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d: Status 404 returned error can't find the container with id 0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d Sep 30 18:00:17 crc kubenswrapper[4797]: I0930 18:00:17.950440 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9f7\" (UniqueName: \"kubernetes.io/projected/efe9cbac-3168-4887-81ea-51d04c2a70c8-kube-api-access-fh9f7\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.248750 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b" path="/var/lib/kubelet/pods/68b471ec-3bb7-4cee-9ab1-8ea7a2d0619b/volumes" Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.249884 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58edcb1-a2c4-4d8d-b004-91eafbcfab4b" path="/var/lib/kubelet/pods/f58edcb1-a2c4-4d8d-b004-91eafbcfab4b/volumes" Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.276018 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4r4kf" event={"ID":"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6","Type":"ContainerStarted","Data":"0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d"} Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.278562 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-dfjp2" Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.278798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dfjp2" event={"ID":"efe9cbac-3168-4887-81ea-51d04c2a70c8","Type":"ContainerDied","Data":"767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f"} Sep 30 18:00:18 crc kubenswrapper[4797]: I0930 18:00:18.278820 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767f28db3e094b1e1e477e3e66f95ceb1e92f8dc4d0d4c09dc4d365c645d912f" Sep 30 18:00:19 crc kubenswrapper[4797]: I0930 18:00:19.079969 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:19 crc kubenswrapper[4797]: E0930 18:00:19.080268 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 18:00:19 crc kubenswrapper[4797]: E0930 18:00:19.080323 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:19 crc kubenswrapper[4797]: E0930 18:00:19.080414 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:23.080384535 +0000 UTC m=+1073.602883813 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:19 crc kubenswrapper[4797]: I0930 18:00:19.293783 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4r4kf" event={"ID":"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6","Type":"ContainerStarted","Data":"17759ec73c8e06d52425478a32d5c9292e4a5f2df9f2b0e9cc066595b7f1bf37"} Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.525194 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2ktxf"] Sep 30 18:00:21 crc kubenswrapper[4797]: E0930 18:00:21.526222 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe9cbac-3168-4887-81ea-51d04c2a70c8" containerName="mariadb-database-create" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.526590 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe9cbac-3168-4887-81ea-51d04c2a70c8" containerName="mariadb-database-create" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.527070 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe9cbac-3168-4887-81ea-51d04c2a70c8" containerName="mariadb-database-create" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.528344 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.550664 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2ktxf"] Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.624805 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97v4l\" (UniqueName: \"kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l\") pod \"keystone-db-create-2ktxf\" (UID: \"0c787cf9-c86e-47f2-a4a9-30c4edc09890\") " pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.726633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97v4l\" (UniqueName: \"kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l\") pod \"keystone-db-create-2ktxf\" (UID: \"0c787cf9-c86e-47f2-a4a9-30c4edc09890\") " pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.753529 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97v4l\" (UniqueName: \"kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l\") pod \"keystone-db-create-2ktxf\" (UID: \"0c787cf9-c86e-47f2-a4a9-30c4edc09890\") " pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.795849 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6t2ts"] Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.797814 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.847947 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6t2ts"] Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.848170 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:21 crc kubenswrapper[4797]: I0930 18:00:21.929145 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6sp\" (UniqueName: \"kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp\") pod \"placement-db-create-6t2ts\" (UID: \"e61c85c6-fd1a-4e9d-884d-7793317d857a\") " pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.030379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6sp\" (UniqueName: \"kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp\") pod \"placement-db-create-6t2ts\" (UID: \"e61c85c6-fd1a-4e9d-884d-7793317d857a\") " pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.050874 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6sp\" (UniqueName: \"kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp\") pod \"placement-db-create-6t2ts\" (UID: \"e61c85c6-fd1a-4e9d-884d-7793317d857a\") " pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.160721 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.314934 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2ktxf"] Sep 30 18:00:22 crc kubenswrapper[4797]: W0930 18:00:22.315401 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c787cf9_c86e_47f2_a4a9_30c4edc09890.slice/crio-dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df WatchSource:0}: Error finding container dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df: Status 404 returned error can't find the container with id dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.351050 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4r4kf" podStartSLOduration=5.351026428 podStartE2EDuration="5.351026428s" podCreationTimestamp="2025-09-30 18:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:22.335395412 +0000 UTC m=+1072.857894650" watchObservedRunningTime="2025-09-30 18:00:22.351026428 +0000 UTC m=+1072.873525676" Sep 30 18:00:22 crc kubenswrapper[4797]: I0930 18:00:22.581933 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6t2ts"] Sep 30 18:00:23 crc kubenswrapper[4797]: I0930 18:00:23.152586 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:23 crc kubenswrapper[4797]: E0930 18:00:23.152798 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Sep 30 18:00:23 crc kubenswrapper[4797]: E0930 18:00:23.152811 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:23 crc kubenswrapper[4797]: E0930 18:00:23.152861 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:31.152848214 +0000 UTC m=+1081.675347452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:23 crc kubenswrapper[4797]: I0930 18:00:23.330739 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2ktxf" event={"ID":"0c787cf9-c86e-47f2-a4a9-30c4edc09890","Type":"ContainerStarted","Data":"dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df"} Sep 30 18:00:23 crc kubenswrapper[4797]: I0930 18:00:23.332896 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6t2ts" event={"ID":"e61c85c6-fd1a-4e9d-884d-7793317d857a","Type":"ContainerStarted","Data":"f54bbcf6952c3beed990022efea862c0594836174d76a8a45013358b21b122d7"} Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.089342 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-fc85-account-create-48fpf"] Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.090606 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.094009 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.099980 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-fc85-account-create-48fpf"] Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.173356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl8rq\" (UniqueName: \"kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq\") pod \"watcher-fc85-account-create-48fpf\" (UID: \"6adb5919-faea-4783-819a-867e38b3d90f\") " pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.274522 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl8rq\" (UniqueName: \"kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq\") pod \"watcher-fc85-account-create-48fpf\" (UID: \"6adb5919-faea-4783-819a-867e38b3d90f\") " pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.297077 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl8rq\" (UniqueName: \"kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq\") pod \"watcher-fc85-account-create-48fpf\" (UID: \"6adb5919-faea-4783-819a-867e38b3d90f\") " pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.351747 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2ktxf" event={"ID":"0c787cf9-c86e-47f2-a4a9-30c4edc09890","Type":"ContainerStarted","Data":"08d22fc23d4e511563e7b9dd2ce31315f91fb29ec4579749adfbcc939316fd48"} Sep 30 18:00:24 crc 
kubenswrapper[4797]: I0930 18:00:24.354403 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6t2ts" event={"ID":"e61c85c6-fd1a-4e9d-884d-7793317d857a","Type":"ContainerStarted","Data":"7402d957f9883b0b8c6151beed01acebd3f907b6e33aad141ac646d4d1244c33"} Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.356039 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" containerID="17759ec73c8e06d52425478a32d5c9292e4a5f2df9f2b0e9cc066595b7f1bf37" exitCode=0 Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.356076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4r4kf" event={"ID":"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6","Type":"ContainerDied","Data":"17759ec73c8e06d52425478a32d5c9292e4a5f2df9f2b0e9cc066595b7f1bf37"} Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.369930 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-2ktxf" podStartSLOduration=3.3699134649999998 podStartE2EDuration="3.369913465s" podCreationTimestamp="2025-09-30 18:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:24.368792184 +0000 UTC m=+1074.891291422" watchObservedRunningTime="2025-09-30 18:00:24.369913465 +0000 UTC m=+1074.892412693" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.388311 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6t2ts" podStartSLOduration=3.388295106 podStartE2EDuration="3.388295106s" podCreationTimestamp="2025-09-30 18:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:24.381273895 +0000 UTC m=+1074.903773153" watchObservedRunningTime="2025-09-30 18:00:24.388295106 +0000 UTC m=+1074.910794344" Sep 
30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.423858 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.547750 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.608215 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 18:00:24 crc kubenswrapper[4797]: I0930 18:00:24.608456 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="dnsmasq-dns" containerID="cri-o://97b5b330253b6b7cc103ae85db3cac071b3ba1d86f1bb1405b4908eb9e697091" gracePeriod=10 Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.365809 4797 generic.go:334] "Generic (PLEG): container finished" podID="0c787cf9-c86e-47f2-a4a9-30c4edc09890" containerID="08d22fc23d4e511563e7b9dd2ce31315f91fb29ec4579749adfbcc939316fd48" exitCode=0 Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.365878 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2ktxf" event={"ID":"0c787cf9-c86e-47f2-a4a9-30c4edc09890","Type":"ContainerDied","Data":"08d22fc23d4e511563e7b9dd2ce31315f91fb29ec4579749adfbcc939316fd48"} Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.376592 4797 generic.go:334] "Generic (PLEG): container finished" podID="e61c85c6-fd1a-4e9d-884d-7793317d857a" containerID="7402d957f9883b0b8c6151beed01acebd3f907b6e33aad141ac646d4d1244c33" exitCode=0 Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.376769 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6t2ts" 
event={"ID":"e61c85c6-fd1a-4e9d-884d-7793317d857a","Type":"ContainerDied","Data":"7402d957f9883b0b8c6151beed01acebd3f907b6e33aad141ac646d4d1244c33"} Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.389600 4797 generic.go:334] "Generic (PLEG): container finished" podID="9e788f55-44b8-4d71-b419-023ad236d45c" containerID="97b5b330253b6b7cc103ae85db3cac071b3ba1d86f1bb1405b4908eb9e697091" exitCode=0 Sep 30 18:00:25 crc kubenswrapper[4797]: I0930 18:00:25.389777 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" event={"ID":"9e788f55-44b8-4d71-b419-023ad236d45c","Type":"ContainerDied","Data":"97b5b330253b6b7cc103ae85db3cac071b3ba1d86f1bb1405b4908eb9e697091"} Sep 30 18:00:26 crc kubenswrapper[4797]: I0930 18:00:26.634567 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:26 crc kubenswrapper[4797]: I0930 18:00:26.714162 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99h6r\" (UniqueName: \"kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r\") pod \"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6\" (UID: \"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6\") " Sep 30 18:00:26 crc kubenswrapper[4797]: I0930 18:00:26.723592 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r" (OuterVolumeSpecName: "kube-api-access-99h6r") pod "4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" (UID: "4b3b6e16-37d0-4b8e-96da-1a4c21c30af6"). InnerVolumeSpecName "kube-api-access-99h6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:26 crc kubenswrapper[4797]: I0930 18:00:26.815753 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99h6r\" (UniqueName: \"kubernetes.io/projected/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6-kube-api-access-99h6r\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:26 crc kubenswrapper[4797]: I0930 18:00:26.853706 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.407564 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4r4kf" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.407557 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4r4kf" event={"ID":"4b3b6e16-37d0-4b8e-96da-1a4c21c30af6","Type":"ContainerDied","Data":"0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d"} Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.407901 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dfc782aa8901e460845a9739079ba6a5d70dabb496ffb20cac8990588bf6e0d" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.410257 4797 generic.go:334] "Generic (PLEG): container finished" podID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerID="32d899fb0b4dcce53cc91d0ae0c0891dd0383d1706912339fb8c5a31d211f1b7" exitCode=0 Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.410309 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerDied","Data":"32d899fb0b4dcce53cc91d0ae0c0891dd0383d1706912339fb8c5a31d211f1b7"} Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.415289 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" 
containerID="ddea9aed82f65ba1df21799ed2ffbae0368261d2ae6cbde18121dc443bad437c" exitCode=0 Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.415328 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerDied","Data":"ddea9aed82f65ba1df21799ed2ffbae0368261d2ae6cbde18121dc443bad437c"} Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.743349 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.750257 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.758873 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.836009 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97v4l\" (UniqueName: \"kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l\") pod \"0c787cf9-c86e-47f2-a4a9-30c4edc09890\" (UID: \"0c787cf9-c86e-47f2-a4a9-30c4edc09890\") " Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.836069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6sp\" (UniqueName: \"kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp\") pod \"e61c85c6-fd1a-4e9d-884d-7793317d857a\" (UID: \"e61c85c6-fd1a-4e9d-884d-7793317d857a\") " Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.836132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thqfb\" (UniqueName: \"kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb\") pod 
\"9e788f55-44b8-4d71-b419-023ad236d45c\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.836248 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config\") pod \"9e788f55-44b8-4d71-b419-023ad236d45c\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.836270 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc\") pod \"9e788f55-44b8-4d71-b419-023ad236d45c\" (UID: \"9e788f55-44b8-4d71-b419-023ad236d45c\") " Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.856084 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp" (OuterVolumeSpecName: "kube-api-access-2h6sp") pod "e61c85c6-fd1a-4e9d-884d-7793317d857a" (UID: "e61c85c6-fd1a-4e9d-884d-7793317d857a"). InnerVolumeSpecName "kube-api-access-2h6sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.856643 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l" (OuterVolumeSpecName: "kube-api-access-97v4l") pod "0c787cf9-c86e-47f2-a4a9-30c4edc09890" (UID: "0c787cf9-c86e-47f2-a4a9-30c4edc09890"). InnerVolumeSpecName "kube-api-access-97v4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.866767 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb" (OuterVolumeSpecName: "kube-api-access-thqfb") pod "9e788f55-44b8-4d71-b419-023ad236d45c" (UID: "9e788f55-44b8-4d71-b419-023ad236d45c"). InnerVolumeSpecName "kube-api-access-thqfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.897095 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e788f55-44b8-4d71-b419-023ad236d45c" (UID: "9e788f55-44b8-4d71-b419-023ad236d45c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.901789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config" (OuterVolumeSpecName: "config") pod "9e788f55-44b8-4d71-b419-023ad236d45c" (UID: "9e788f55-44b8-4d71-b419-023ad236d45c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.938528 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thqfb\" (UniqueName: \"kubernetes.io/projected/9e788f55-44b8-4d71-b419-023ad236d45c-kube-api-access-thqfb\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.938565 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.938574 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e788f55-44b8-4d71-b419-023ad236d45c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.938582 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97v4l\" (UniqueName: \"kubernetes.io/projected/0c787cf9-c86e-47f2-a4a9-30c4edc09890-kube-api-access-97v4l\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:27 crc kubenswrapper[4797]: I0930 18:00:27.938591 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6sp\" (UniqueName: \"kubernetes.io/projected/e61c85c6-fd1a-4e9d-884d-7793317d857a-kube-api-access-2h6sp\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.425510 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2ktxf" event={"ID":"0c787cf9-c86e-47f2-a4a9-30c4edc09890","Type":"ContainerDied","Data":"dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df"} Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.425782 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf4d11cae016e1a1f176dd8b32c9bb6ec364ff5255c49c704702e82910bd4df" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.425613 
4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2ktxf" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.427986 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6t2ts" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.427998 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6t2ts" event={"ID":"e61c85c6-fd1a-4e9d-884d-7793317d857a","Type":"ContainerDied","Data":"f54bbcf6952c3beed990022efea862c0594836174d76a8a45013358b21b122d7"} Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.428056 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54bbcf6952c3beed990022efea862c0594836174d76a8a45013358b21b122d7" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.430812 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" event={"ID":"9e788f55-44b8-4d71-b419-023ad236d45c","Type":"ContainerDied","Data":"2def468362e5d068b187faaea74d4262d28dc85cb19d4b1aab19d4ca79c47539"} Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.430859 4797 scope.go:117] "RemoveContainer" containerID="97b5b330253b6b7cc103ae85db3cac071b3ba1d86f1bb1405b4908eb9e697091" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.430868 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.454135 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.459596 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ch6v6"] Sep 30 18:00:28 crc kubenswrapper[4797]: I0930 18:00:28.998691 4797 scope.go:117] "RemoveContainer" containerID="3a0cebdf71004c535d1e53fbec5c1a63de2092a9bc3cb6a3390da928948412e8" Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.396144 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-fc85-account-create-48fpf"] Sep 30 18:00:29 crc kubenswrapper[4797]: W0930 18:00:29.401792 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6adb5919_faea_4783_819a_867e38b3d90f.slice/crio-ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6 WatchSource:0}: Error finding container ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6: Status 404 returned error can't find the container with id ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6 Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.439825 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-fc85-account-create-48fpf" event={"ID":"6adb5919-faea-4783-819a-867e38b3d90f","Type":"ContainerStarted","Data":"ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6"} Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.443728 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerStarted","Data":"22131c6a5fa0809a5686cd490b3bff816c4adae27c7c7ba69b0e39a7803fa7d0"} Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.449008 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerStarted","Data":"df96003e9fe01c2d21ccf4fa5f58d3eaab27428f49a64ad0f4d986716752d52e"} Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.449199 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.451845 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerStarted","Data":"4c7e640eaf85c51f57b6cf5bb940418082a77d3851b444a2154f6865b2d5de23"} Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.452028 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.453092 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cx5sg" event={"ID":"139c3278-4f30-418b-ae01-2ea9ac63ab55","Type":"ContainerStarted","Data":"33b828e5d1b6a33e35b18e2397ef6efea8bfcdec391020b8dc7a7ffe5abaa204"} Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.486873 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.152133797 podStartE2EDuration="52.486849444s" podCreationTimestamp="2025-09-30 17:59:37 +0000 UTC" firstStartedPulling="2025-09-30 17:59:43.056460674 +0000 UTC m=+1033.578959912" lastFinishedPulling="2025-09-30 17:59:56.391176321 +0000 UTC m=+1046.913675559" observedRunningTime="2025-09-30 18:00:29.478552267 +0000 UTC m=+1080.001051515" watchObservedRunningTime="2025-09-30 18:00:29.486849444 +0000 UTC m=+1080.009348682" Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.500337 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=39.215958048 podStartE2EDuration="52.500311411s" podCreationTimestamp="2025-09-30 17:59:37 +0000 UTC" firstStartedPulling="2025-09-30 17:59:43.056117604 +0000 UTC m=+1033.578616842" lastFinishedPulling="2025-09-30 17:59:56.340470967 +0000 UTC m=+1046.862970205" observedRunningTime="2025-09-30 18:00:29.49913346 +0000 UTC m=+1080.021632738" watchObservedRunningTime="2025-09-30 18:00:29.500311411 +0000 UTC m=+1080.022810649" Sep 30 18:00:29 crc kubenswrapper[4797]: I0930 18:00:29.531888 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cx5sg" podStartSLOduration=2.0926648979999998 podStartE2EDuration="14.531862832s" podCreationTimestamp="2025-09-30 18:00:15 +0000 UTC" firstStartedPulling="2025-09-30 18:00:16.63109724 +0000 UTC m=+1067.153596478" lastFinishedPulling="2025-09-30 18:00:29.070295164 +0000 UTC m=+1079.592794412" observedRunningTime="2025-09-30 18:00:29.527799872 +0000 UTC m=+1080.050299110" watchObservedRunningTime="2025-09-30 18:00:29.531862832 +0000 UTC m=+1080.054362090" Sep 30 18:00:30 crc kubenswrapper[4797]: I0930 18:00:30.254182 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" path="/var/lib/kubelet/pods/9e788f55-44b8-4d71-b419-023ad236d45c/volumes" Sep 30 18:00:30 crc kubenswrapper[4797]: I0930 18:00:30.462833 4797 generic.go:334] "Generic (PLEG): container finished" podID="6adb5919-faea-4783-819a-867e38b3d90f" containerID="d95becb5ce012d2a876f6fe02de65b310ac3b6aedd147337d1387a2bfd04f8b5" exitCode=0 Sep 30 18:00:30 crc kubenswrapper[4797]: I0930 18:00:30.462990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-fc85-account-create-48fpf" event={"ID":"6adb5919-faea-4783-819a-867e38b3d90f","Type":"ContainerDied","Data":"d95becb5ce012d2a876f6fe02de65b310ac3b6aedd147337d1387a2bfd04f8b5"} Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.210412 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.210670 4797 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.210712 4797 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.210792 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift podName:4fecceec-298d-4979-b468-5fe35c9b68e7 nodeName:}" failed. No retries permitted until 2025-09-30 18:00:47.2107691 +0000 UTC m=+1097.733268348 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift") pod "swift-storage-0" (UID: "4fecceec-298d-4979-b468-5fe35c9b68e7") : configmap "swift-ring-files" not found Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.651576 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7981-account-create-2vvns"] Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.652256 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="init" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652267 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="init" Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.652282 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61c85c6-fd1a-4e9d-884d-7793317d857a" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652289 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61c85c6-fd1a-4e9d-884d-7793317d857a" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.652300 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c787cf9-c86e-47f2-a4a9-30c4edc09890" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652309 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c787cf9-c86e-47f2-a4a9-30c4edc09890" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.652323 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="dnsmasq-dns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652329 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" 
containerName="dnsmasq-dns" Sep 30 18:00:31 crc kubenswrapper[4797]: E0930 18:00:31.652340 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652346 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652522 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652541 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="dnsmasq-dns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652554 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c787cf9-c86e-47f2-a4a9-30c4edc09890" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.652569 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61c85c6-fd1a-4e9d-884d-7793317d857a" containerName="mariadb-database-create" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.655752 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.662412 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.672882 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7981-account-create-2vvns"] Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.722400 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545b6\" (UniqueName: \"kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6\") pod \"keystone-7981-account-create-2vvns\" (UID: \"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2\") " pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.823513 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545b6\" (UniqueName: \"kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6\") pod \"keystone-7981-account-create-2vvns\" (UID: \"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2\") " pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.851860 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-545b6\" (UniqueName: \"kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6\") pod \"keystone-7981-account-create-2vvns\" (UID: \"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2\") " pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.912218 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.924732 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl8rq\" (UniqueName: \"kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq\") pod \"6adb5919-faea-4783-819a-867e38b3d90f\" (UID: \"6adb5919-faea-4783-819a-867e38b3d90f\") " Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.938733 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq" (OuterVolumeSpecName: "kube-api-access-vl8rq") pod "6adb5919-faea-4783-819a-867e38b3d90f" (UID: "6adb5919-faea-4783-819a-867e38b3d90f"). InnerVolumeSpecName "kube-api-access-vl8rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:31 crc kubenswrapper[4797]: I0930 18:00:31.979170 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.025922 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl8rq\" (UniqueName: \"kubernetes.io/projected/6adb5919-faea-4783-819a-867e38b3d90f-kube-api-access-vl8rq\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.420870 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7981-account-create-2vvns"] Sep 30 18:00:32 crc kubenswrapper[4797]: W0930 18:00:32.429117 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeedc91e0_c1c4_41b5_90f7_415b32cd9ca2.slice/crio-3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6 WatchSource:0}: Error finding container 3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6: Status 404 returned error can't find the container with id 3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6 Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.485657 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-fc85-account-create-48fpf" event={"ID":"6adb5919-faea-4783-819a-867e38b3d90f","Type":"ContainerDied","Data":"ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6"} Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.485698 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab77fcb879cf26253e4626891ea7213f688ebc9decae77c899c2038c8b1462f6" Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.485760 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-fc85-account-create-48fpf" Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.490222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerStarted","Data":"573b550a3cbf98d75ef2ead90adcddfbd1f7a828fc537cdba9456a9a513ea048"} Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.491772 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7981-account-create-2vvns" event={"ID":"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2","Type":"ContainerStarted","Data":"3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6"} Sep 30 18:00:32 crc kubenswrapper[4797]: I0930 18:00:32.521865 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-ch6v6" podUID="9e788f55-44b8-4d71-b419-023ad236d45c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: i/o timeout" Sep 30 18:00:33 crc kubenswrapper[4797]: I0930 18:00:33.501704 4797 generic.go:334] "Generic (PLEG): container finished" podID="eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" containerID="7a07c0a556d68c593cf31c83956e7f32fd3822c0fce425b020e522c3dd16ab26" exitCode=0 Sep 30 18:00:33 crc kubenswrapper[4797]: I0930 18:00:33.502115 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7981-account-create-2vvns" event={"ID":"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2","Type":"ContainerDied","Data":"7a07c0a556d68c593cf31c83956e7f32fd3822c0fce425b020e522c3dd16ab26"} Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.094880 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.290647 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545b6\" (UniqueName: \"kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6\") pod \"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2\" (UID: \"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2\") " Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.312338 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6" (OuterVolumeSpecName: "kube-api-access-545b6") pod "eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" (UID: "eedc91e0-c1c4-41b5-90f7-415b32cd9ca2"). InnerVolumeSpecName "kube-api-access-545b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.393082 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-545b6\" (UniqueName: \"kubernetes.io/projected/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2-kube-api-access-545b6\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.519594 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7981-account-create-2vvns" event={"ID":"eedc91e0-c1c4-41b5-90f7-415b32cd9ca2","Type":"ContainerDied","Data":"3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6"} Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.519637 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b92bd48ca5755e5a0a51cf25c1263462d4aa9633a1be65da3df882e063dc4c6" Sep 30 18:00:35 crc kubenswrapper[4797]: I0930 18:00:35.519678 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7981-account-create-2vvns" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.231687 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-96eb-account-create-5nvh4"] Sep 30 18:00:37 crc kubenswrapper[4797]: E0930 18:00:37.232298 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adb5919-faea-4783-819a-867e38b3d90f" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.232310 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adb5919-faea-4783-819a-867e38b3d90f" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: E0930 18:00:37.232325 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.232331 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.232541 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.232555 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adb5919-faea-4783-819a-867e38b3d90f" containerName="mariadb-account-create" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.233307 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.240818 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.250782 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-96eb-account-create-5nvh4"] Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.428977 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ssm\" (UniqueName: \"kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm\") pod \"glance-96eb-account-create-5nvh4\" (UID: \"54d827f7-8cc5-41fc-a453-880a27418b98\") " pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.530617 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ssm\" (UniqueName: \"kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm\") pod \"glance-96eb-account-create-5nvh4\" (UID: \"54d827f7-8cc5-41fc-a453-880a27418b98\") " pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.550928 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ssm\" (UniqueName: \"kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm\") pod \"glance-96eb-account-create-5nvh4\" (UID: \"54d827f7-8cc5-41fc-a453-880a27418b98\") " pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:37 crc kubenswrapper[4797]: I0930 18:00:37.565196 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.104983 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nf4pk" podUID="4a527992-92f7-4aab-b8d4-e75ec72fd684" containerName="ovn-controller" probeResult="failure" output=< Sep 30 18:00:38 crc kubenswrapper[4797]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 18:00:38 crc kubenswrapper[4797]: > Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.134675 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-96eb-account-create-5nvh4"] Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.135299 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.149806 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dqkv4" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.367571 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nf4pk-config-hnjl7"] Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.368772 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.370997 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.376377 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nf4pk-config-hnjl7"] Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.492777 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kv\" (UniqueName: \"kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.492843 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.492970 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.494081 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: 
\"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.494170 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.494244 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.596478 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.596822 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.596889 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: 
\"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.596962 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.596997 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9kv\" (UniqueName: \"kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.597048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.597127 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.597135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: 
\"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.597358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.597832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.599322 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.618108 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kv\" (UniqueName: \"kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv\") pod \"ovn-controller-nf4pk-config-hnjl7\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:38 crc kubenswrapper[4797]: I0930 18:00:38.707400 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:39 crc kubenswrapper[4797]: W0930 18:00:39.037908 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54d827f7_8cc5_41fc_a453_880a27418b98.slice/crio-7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee WatchSource:0}: Error finding container 7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee: Status 404 returned error can't find the container with id 7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee Sep 30 18:00:39 crc kubenswrapper[4797]: I0930 18:00:39.534150 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nf4pk-config-hnjl7"] Sep 30 18:00:39 crc kubenswrapper[4797]: I0930 18:00:39.555204 4797 generic.go:334] "Generic (PLEG): container finished" podID="139c3278-4f30-418b-ae01-2ea9ac63ab55" containerID="33b828e5d1b6a33e35b18e2397ef6efea8bfcdec391020b8dc7a7ffe5abaa204" exitCode=0 Sep 30 18:00:39 crc kubenswrapper[4797]: I0930 18:00:39.555288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cx5sg" event={"ID":"139c3278-4f30-418b-ae01-2ea9ac63ab55","Type":"ContainerDied","Data":"33b828e5d1b6a33e35b18e2397ef6efea8bfcdec391020b8dc7a7ffe5abaa204"} Sep 30 18:00:39 crc kubenswrapper[4797]: I0930 18:00:39.557832 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96eb-account-create-5nvh4" event={"ID":"54d827f7-8cc5-41fc-a453-880a27418b98","Type":"ContainerStarted","Data":"18913c8dad6b4389db5c95b8c8d2061f872585bf65da39434d5a92d70cb9b35c"} Sep 30 18:00:39 crc kubenswrapper[4797]: I0930 18:00:39.557864 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96eb-account-create-5nvh4" event={"ID":"54d827f7-8cc5-41fc-a453-880a27418b98","Type":"ContainerStarted","Data":"7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee"} Sep 30 
18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.572839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerStarted","Data":"3cb3fb1841e87f3f017879f644e497d684eedf449cbb5910af9c64d6a36980d2"} Sep 30 18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.575808 4797 generic.go:334] "Generic (PLEG): container finished" podID="54d827f7-8cc5-41fc-a453-880a27418b98" containerID="18913c8dad6b4389db5c95b8c8d2061f872585bf65da39434d5a92d70cb9b35c" exitCode=0 Sep 30 18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.575867 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96eb-account-create-5nvh4" event={"ID":"54d827f7-8cc5-41fc-a453-880a27418b98","Type":"ContainerDied","Data":"18913c8dad6b4389db5c95b8c8d2061f872585bf65da39434d5a92d70cb9b35c"} Sep 30 18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.578868 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk-config-hnjl7" event={"ID":"38af9635-bc06-4376-97da-902b19369533","Type":"ContainerStarted","Data":"1af04ae03e62803cc27062452272d9a85a40afa0b07cd13b3ea375ab204604d2"} Sep 30 18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.578931 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk-config-hnjl7" event={"ID":"38af9635-bc06-4376-97da-902b19369533","Type":"ContainerStarted","Data":"ebe7ad6b05574dddd33d00860671cebd50f35eea7b2dd63a6a4e441146e92ff2"} Sep 30 18:00:40 crc kubenswrapper[4797]: I0930 18:00:40.613252 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.436052621 podStartE2EDuration="57.613228754s" podCreationTimestamp="2025-09-30 17:59:43 +0000 UTC" firstStartedPulling="2025-09-30 17:59:56.785548976 +0000 UTC m=+1047.308048214" lastFinishedPulling="2025-09-30 18:00:39.962725109 +0000 UTC m=+1090.485224347" 
observedRunningTime="2025-09-30 18:00:40.611559269 +0000 UTC m=+1091.134058577" watchObservedRunningTime="2025-09-30 18:00:40.613228754 +0000 UTC m=+1091.135728022" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.035734 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.048879 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.048936 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.048968 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049046 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049145 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049187 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzgl\" (UniqueName: \"kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049272 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift\") pod \"139c3278-4f30-418b-ae01-2ea9ac63ab55\" (UID: \"139c3278-4f30-418b-ae01-2ea9ac63ab55\") " Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049715 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.049817 4797 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.050948 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.087859 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.088248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.088837 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts" (OuterVolumeSpecName: "scripts") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.089368 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.090203 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl" (OuterVolumeSpecName: "kube-api-access-4zzgl") pod "139c3278-4f30-418b-ae01-2ea9ac63ab55" (UID: "139c3278-4f30-418b-ae01-2ea9ac63ab55"). InnerVolumeSpecName "kube-api-access-4zzgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151252 4797 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151281 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zzgl\" (UniqueName: \"kubernetes.io/projected/139c3278-4f30-418b-ae01-2ea9ac63ab55-kube-api-access-4zzgl\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151293 4797 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/139c3278-4f30-418b-ae01-2ea9ac63ab55-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151305 4797 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151316 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139c3278-4f30-418b-ae01-2ea9ac63ab55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.151324 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/139c3278-4f30-418b-ae01-2ea9ac63ab55-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.593736 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cx5sg" event={"ID":"139c3278-4f30-418b-ae01-2ea9ac63ab55","Type":"ContainerDied","Data":"07b587c7e1f98929a9c70c20036a595eaba63ec2e7e3d8cd255c244514504e70"} Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.593812 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07b587c7e1f98929a9c70c20036a595eaba63ec2e7e3d8cd255c244514504e70" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.593784 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cx5sg" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.596986 4797 generic.go:334] "Generic (PLEG): container finished" podID="38af9635-bc06-4376-97da-902b19369533" containerID="1af04ae03e62803cc27062452272d9a85a40afa0b07cd13b3ea375ab204604d2" exitCode=0 Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.597382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk-config-hnjl7" event={"ID":"38af9635-bc06-4376-97da-902b19369533","Type":"ContainerDied","Data":"1af04ae03e62803cc27062452272d9a85a40afa0b07cd13b3ea375ab204604d2"} Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.941992 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e107-account-create-tcs6r"] Sep 30 18:00:41 crc kubenswrapper[4797]: E0930 18:00:41.942921 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139c3278-4f30-418b-ae01-2ea9ac63ab55" containerName="swift-ring-rebalance" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.942940 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="139c3278-4f30-418b-ae01-2ea9ac63ab55" containerName="swift-ring-rebalance" Sep 30 18:00:41 crc 
kubenswrapper[4797]: I0930 18:00:41.943158 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="139c3278-4f30-418b-ae01-2ea9ac63ab55" containerName="swift-ring-rebalance" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.944324 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.948342 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 18:00:41 crc kubenswrapper[4797]: I0930 18:00:41.960779 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e107-account-create-tcs6r"] Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.066024 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dwb4\" (UniqueName: \"kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4\") pod \"placement-e107-account-create-tcs6r\" (UID: \"b1f0af8c-9d47-4ab5-807d-7853398fcf78\") " pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.106884 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.112577 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.167540 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dwb4\" (UniqueName: \"kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4\") pod \"placement-e107-account-create-tcs6r\" (UID: \"b1f0af8c-9d47-4ab5-807d-7853398fcf78\") " pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.192722 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dwb4\" (UniqueName: \"kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4\") pod \"placement-e107-account-create-tcs6r\" (UID: \"b1f0af8c-9d47-4ab5-807d-7853398fcf78\") " pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.264790 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269222 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9kv\" (UniqueName: \"kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269264 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269411 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269497 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn\") pod \"38af9635-bc06-4376-97da-902b19369533\" (UID: \"38af9635-bc06-4376-97da-902b19369533\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.269577 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8ssm\" (UniqueName: \"kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm\") pod \"54d827f7-8cc5-41fc-a453-880a27418b98\" (UID: \"54d827f7-8cc5-41fc-a453-880a27418b98\") " Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.270094 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.270154 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.270399 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run" (OuterVolumeSpecName: "var-run") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.270578 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.271196 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts" (OuterVolumeSpecName: "scripts") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.274869 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv" (OuterVolumeSpecName: "kube-api-access-ss9kv") pod "38af9635-bc06-4376-97da-902b19369533" (UID: "38af9635-bc06-4376-97da-902b19369533"). InnerVolumeSpecName "kube-api-access-ss9kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.277442 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm" (OuterVolumeSpecName: "kube-api-access-p8ssm") pod "54d827f7-8cc5-41fc-a453-880a27418b98" (UID: "54d827f7-8cc5-41fc-a453-880a27418b98"). InnerVolumeSpecName "kube-api-access-p8ssm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.371816 4797 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372146 4797 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372161 4797 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38af9635-bc06-4376-97da-902b19369533-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372175 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8ssm\" (UniqueName: \"kubernetes.io/projected/54d827f7-8cc5-41fc-a453-880a27418b98-kube-api-access-p8ssm\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372190 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9kv\" (UniqueName: \"kubernetes.io/projected/38af9635-bc06-4376-97da-902b19369533-kube-api-access-ss9kv\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372201 4797 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.372213 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38af9635-bc06-4376-97da-902b19369533-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.609397 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nf4pk-config-hnjl7" event={"ID":"38af9635-bc06-4376-97da-902b19369533","Type":"ContainerDied","Data":"ebe7ad6b05574dddd33d00860671cebd50f35eea7b2dd63a6a4e441146e92ff2"} Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.609455 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nf4pk-config-hnjl7" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.609470 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebe7ad6b05574dddd33d00860671cebd50f35eea7b2dd63a6a4e441146e92ff2" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.611908 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96eb-account-create-5nvh4" event={"ID":"54d827f7-8cc5-41fc-a453-880a27418b98","Type":"ContainerDied","Data":"7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee"} Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.611951 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f1d9e7a19e099fb81822a900770cc0e62ccb3509190a77b6a85b11c444b05ee" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.611991 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96eb-account-create-5nvh4" Sep 30 18:00:42 crc kubenswrapper[4797]: I0930 18:00:42.777055 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e107-account-create-tcs6r"] Sep 30 18:00:42 crc kubenswrapper[4797]: W0930 18:00:42.781663 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f0af8c_9d47_4ab5_807d_7853398fcf78.slice/crio-6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135 WatchSource:0}: Error finding container 6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135: Status 404 returned error can't find the container with id 6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135 Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.086093 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nf4pk" Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.251577 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nf4pk-config-hnjl7"] Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.259053 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nf4pk-config-hnjl7"] Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.621353 4797 generic.go:334] "Generic (PLEG): container finished" podID="b1f0af8c-9d47-4ab5-807d-7853398fcf78" containerID="83b26af432cda3863ffd7d53c2a3b673abd70ad500ede9638dc17372ac9581f5" exitCode=0 Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.621410 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e107-account-create-tcs6r" event={"ID":"b1f0af8c-9d47-4ab5-807d-7853398fcf78","Type":"ContainerDied","Data":"83b26af432cda3863ffd7d53c2a3b673abd70ad500ede9638dc17372ac9581f5"} Sep 30 18:00:43 crc kubenswrapper[4797]: I0930 18:00:43.621500 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-e107-account-create-tcs6r" event={"ID":"b1f0af8c-9d47-4ab5-807d-7853398fcf78","Type":"ContainerStarted","Data":"6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135"} Sep 30 18:00:44 crc kubenswrapper[4797]: I0930 18:00:44.192276 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:00:44 crc kubenswrapper[4797]: I0930 18:00:44.192638 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:00:44 crc kubenswrapper[4797]: I0930 18:00:44.253413 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38af9635-bc06-4376-97da-902b19369533" path="/var/lib/kubelet/pods/38af9635-bc06-4376-97da-902b19369533/volumes" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.055663 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.219393 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.219666 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.221745 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dwb4\" (UniqueName: \"kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4\") pod \"b1f0af8c-9d47-4ab5-807d-7853398fcf78\" (UID: \"b1f0af8c-9d47-4ab5-807d-7853398fcf78\") " Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.224709 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.231329 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4" (OuterVolumeSpecName: "kube-api-access-9dwb4") pod "b1f0af8c-9d47-4ab5-807d-7853398fcf78" (UID: "b1f0af8c-9d47-4ab5-807d-7853398fcf78"). InnerVolumeSpecName "kube-api-access-9dwb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.323893 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dwb4\" (UniqueName: \"kubernetes.io/projected/b1f0af8c-9d47-4ab5-807d-7853398fcf78-kube-api-access-9dwb4\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.646535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e107-account-create-tcs6r" event={"ID":"b1f0af8c-9d47-4ab5-807d-7853398fcf78","Type":"ContainerDied","Data":"6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135"} Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.646597 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a07e0bbd537985faaac3e4a8ad39c5eebb2ee93e4408202d2238da4ba410135" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.646551 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e107-account-create-tcs6r" Sep 30 18:00:45 crc kubenswrapper[4797]: I0930 18:00:45.648763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.275799 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.284698 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fecceec-298d-4979-b468-5fe35c9b68e7-etc-swift\") pod \"swift-storage-0\" (UID: \"4fecceec-298d-4979-b468-5fe35c9b68e7\") " pod="openstack/swift-storage-0" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 
18:00:47.375627 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-f899g"] Sep 30 18:00:47 crc kubenswrapper[4797]: E0930 18:00:47.376190 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d827f7-8cc5-41fc-a453-880a27418b98" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376285 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d827f7-8cc5-41fc-a453-880a27418b98" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: E0930 18:00:47.376352 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f0af8c-9d47-4ab5-807d-7853398fcf78" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376414 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f0af8c-9d47-4ab5-807d-7853398fcf78" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: E0930 18:00:47.376529 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38af9635-bc06-4376-97da-902b19369533" containerName="ovn-config" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376597 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="38af9635-bc06-4376-97da-902b19369533" containerName="ovn-config" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376821 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d827f7-8cc5-41fc-a453-880a27418b98" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376884 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="38af9635-bc06-4376-97da-902b19369533" containerName="ovn-config" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.376957 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f0af8c-9d47-4ab5-807d-7853398fcf78" containerName="mariadb-account-create" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.377709 
4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.380267 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n2zcg" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.380530 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.390426 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f899g"] Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.480646 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpnb\" (UniqueName: \"kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.480712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.480744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.480807 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.493275 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.582018 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.582072 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.582100 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.582189 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpnb\" (UniqueName: \"kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.586208 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.586792 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.587903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.599944 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpnb\" (UniqueName: \"kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb\") pod \"glance-db-sync-f899g\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " pod="openstack/glance-db-sync-f899g" Sep 30 18:00:47 crc kubenswrapper[4797]: I0930 18:00:47.703814 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f899g" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.095405 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 18:00:48 crc kubenswrapper[4797]: W0930 18:00:48.099161 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fecceec_298d_4979_b468_5fe35c9b68e7.slice/crio-31640323eef5189eb94eb4534c7770e00017b249f24ea3b1cc81f86933d7499a WatchSource:0}: Error finding container 31640323eef5189eb94eb4534c7770e00017b249f24ea3b1cc81f86933d7499a: Status 404 returned error can't find the container with id 31640323eef5189eb94eb4534c7770e00017b249f24ea3b1cc81f86933d7499a Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.255102 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.368654 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.623975 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.683697 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"31640323eef5189eb94eb4534c7770e00017b249f24ea3b1cc81f86933d7499a"} Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.683897 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="prometheus" containerID="cri-o://22131c6a5fa0809a5686cd490b3bff816c4adae27c7c7ba69b0e39a7803fa7d0" gracePeriod=600 Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.683933 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="thanos-sidecar" containerID="cri-o://3cb3fb1841e87f3f017879f644e497d684eedf449cbb5910af9c64d6a36980d2" gracePeriod=600 Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.684008 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="config-reloader" containerID="cri-o://573b550a3cbf98d75ef2ead90adcddfbd1f7a828fc537cdba9456a9a513ea048" gracePeriod=600 Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.751124 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kfmjd"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.752422 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.776048 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kfmjd"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.786052 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-g9tqz"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.788694 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.791154 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.791426 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-f2w2l" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.839902 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-g9tqz"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.914639 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kmcqt"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.915851 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.923876 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.923927 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l544t\" (UniqueName: \"kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.924008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfj86\" (UniqueName: \"kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86\") pod 
\"cinder-db-create-kfmjd\" (UID: \"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f\") " pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.924049 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.924077 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.938812 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kmcqt"] Sep 30 18:00:48 crc kubenswrapper[4797]: I0930 18:00:48.974315 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f899g"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.027639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.027764 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75699\" (UniqueName: \"kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699\") pod \"barbican-db-create-kmcqt\" (UID: \"dd66af3e-b824-414c-abd8-339c62a9e9e1\") " 
pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.027863 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.027907 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l544t\" (UniqueName: \"kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.027996 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfj86\" (UniqueName: \"kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86\") pod \"cinder-db-create-kfmjd\" (UID: \"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f\") " pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.028052 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.036348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 
18:00:49.036672 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.042147 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.050400 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l544t\" (UniqueName: \"kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t\") pod \"watcher-db-sync-g9tqz\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.053309 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfj86\" (UniqueName: \"kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86\") pod \"cinder-db-create-kfmjd\" (UID: \"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f\") " pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.062403 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fqpjz"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.063689 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.069234 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fqpjz"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.086159 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.138376 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75699\" (UniqueName: \"kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699\") pod \"barbican-db-create-kmcqt\" (UID: \"dd66af3e-b824-414c-abd8-339c62a9e9e1\") " pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.167262 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jqh9l"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.168399 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.170346 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.171036 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8tsjk" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.173281 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jqh9l"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.179655 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.183499 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75699\" (UniqueName: \"kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699\") pod \"barbican-db-create-kmcqt\" (UID: \"dd66af3e-b824-414c-abd8-339c62a9e9e1\") " pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.183579 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.185893 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.241293 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.241906 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchq8\" (UniqueName: \"kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8\") pod \"neutron-db-create-fqpjz\" (UID: \"f2a0c476-b584-4818-a7c0-a5da97aaf4df\") " pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.347054 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchq8\" (UniqueName: \"kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8\") pod \"neutron-db-create-fqpjz\" (UID: \"f2a0c476-b584-4818-a7c0-a5da97aaf4df\") " pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.347137 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bcww\" (UniqueName: \"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.347177 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.347217 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle\") pod \"keystone-db-sync-jqh9l\" (UID: 
\"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.390214 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchq8\" (UniqueName: \"kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8\") pod \"neutron-db-create-fqpjz\" (UID: \"f2a0c476-b584-4818-a7c0-a5da97aaf4df\") " pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.451021 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bcww\" (UniqueName: \"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.451097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.451144 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.460340 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc 
kubenswrapper[4797]: I0930 18:00:49.476760 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bcww\" (UniqueName: \"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.477028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle\") pod \"keystone-db-sync-jqh9l\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.485518 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.495237 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.696774 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kfmjd"] Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.714936 4797 generic.go:334] "Generic (PLEG): container finished" podID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerID="3cb3fb1841e87f3f017879f644e497d684eedf449cbb5910af9c64d6a36980d2" exitCode=0 Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.714965 4797 generic.go:334] "Generic (PLEG): container finished" podID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerID="573b550a3cbf98d75ef2ead90adcddfbd1f7a828fc537cdba9456a9a513ea048" exitCode=0 Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.714975 4797 generic.go:334] "Generic (PLEG): container finished" podID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerID="22131c6a5fa0809a5686cd490b3bff816c4adae27c7c7ba69b0e39a7803fa7d0" exitCode=0 Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.715012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerDied","Data":"3cb3fb1841e87f3f017879f644e497d684eedf449cbb5910af9c64d6a36980d2"} Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.715036 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerDied","Data":"573b550a3cbf98d75ef2ead90adcddfbd1f7a828fc537cdba9456a9a513ea048"} Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.715048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerDied","Data":"22131c6a5fa0809a5686cd490b3bff816c4adae27c7c7ba69b0e39a7803fa7d0"} Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.737659 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f899g" event={"ID":"fe16203f-60b5-483f-83b5-1d26b25292c9","Type":"ContainerStarted","Data":"9eab8b9b6e738585766f4b6640833667f65fa49131767f074f6b3f413eab1c66"} Sep 30 18:00:49 crc kubenswrapper[4797]: W0930 18:00:49.753551 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4df77de6_491d_4cbb_a8e4_ce74e0e99e9f.slice/crio-59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec WatchSource:0}: Error finding container 59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec: Status 404 returned error can't find the container with id 59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec Sep 30 18:00:49 crc kubenswrapper[4797]: I0930 18:00:49.895995 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-g9tqz"] Sep 30 18:00:49 crc kubenswrapper[4797]: W0930 18:00:49.969150 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f8ded7_5fab_43fb_8d0f_f514889b5640.slice/crio-f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8 WatchSource:0}: Error finding container f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8: Status 404 returned error can't find the container with id f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8 Sep 30 18:00:50 crc kubenswrapper[4797]: I0930 18:00:50.004195 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kmcqt"] Sep 30 18:00:50 crc kubenswrapper[4797]: W0930 18:00:50.028720 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd66af3e_b824_414c_abd8_339c62a9e9e1.slice/crio-81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49 WatchSource:0}: Error finding container 
81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49: Status 404 returned error can't find the container with id 81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49 Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.148938 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.269452 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw6kp\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270467 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270615 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270763 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270790 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270819 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.270870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out\") pod \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\" (UID: \"433bd6ef-bdef-4c2b-9fb3-019fecae8b40\") " Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.271919 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.275919 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out" (OuterVolumeSpecName: "config-out") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.278904 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp" (OuterVolumeSpecName: "kube-api-access-cw6kp") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "kube-api-access-cw6kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.280486 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.282248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.288247 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config" (OuterVolumeSpecName: "config") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.310361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.312475 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jqh9l"] Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.351145 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config" (OuterVolumeSpecName: "web-config") pod "433bd6ef-bdef-4c2b-9fb3-019fecae8b40" (UID: "433bd6ef-bdef-4c2b-9fb3-019fecae8b40"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.373924 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.373983 4797 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374002 4797 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374019 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw6kp\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-kube-api-access-cw6kp\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374033 4797 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374042 4797 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374094 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") on node \"crc\" " Sep 30 
18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.374109 4797 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/433bd6ef-bdef-4c2b-9fb3-019fecae8b40-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.412159 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fqpjz"] Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.444716 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.444884 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf") on node "crc" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.477991 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.751823 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fqpjz" event={"ID":"f2a0c476-b584-4818-a7c0-a5da97aaf4df","Type":"ContainerStarted","Data":"f18341b0bbc374dfc82b2c735ef7c50562f756492be5b4cd517db723eb4e3420"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.753887 4797 generic.go:334] "Generic (PLEG): container finished" podID="4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" containerID="d336439f2abc7c706dbf6b4deba195f1732b63db8db4cf6d2ffa2ccd4eb61ff8" exitCode=0 Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.753923 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kfmjd" 
event={"ID":"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f","Type":"ContainerDied","Data":"d336439f2abc7c706dbf6b4deba195f1732b63db8db4cf6d2ffa2ccd4eb61ff8"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.753938 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kfmjd" event={"ID":"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f","Type":"ContainerStarted","Data":"59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.760365 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"433bd6ef-bdef-4c2b-9fb3-019fecae8b40","Type":"ContainerDied","Data":"f675b79a7a7293f6cfd5c58b833e7ecebd856f9a61fdcbb5d20f75088205b2bf"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.760416 4797 scope.go:117] "RemoveContainer" containerID="3cb3fb1841e87f3f017879f644e497d684eedf449cbb5910af9c64d6a36980d2" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.760255 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.764550 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqh9l" event={"ID":"f34d2759-ae6e-43a3-8010-80596a570a37","Type":"ContainerStarted","Data":"751411a8428c83d8ccba3c558a352b322228318778c9641738562b5ca637227d"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.769394 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-g9tqz" event={"ID":"d7f8ded7-5fab-43fb-8d0f-f514889b5640","Type":"ContainerStarted","Data":"f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.773868 4797 generic.go:334] "Generic (PLEG): container finished" podID="dd66af3e-b824-414c-abd8-339c62a9e9e1" containerID="15d0eb2786741206fc82ff61e17abb660bb8d2e41e57f0e72658349b2a957ad3" exitCode=0 Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.773904 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kmcqt" event={"ID":"dd66af3e-b824-414c-abd8-339c62a9e9e1","Type":"ContainerDied","Data":"15d0eb2786741206fc82ff61e17abb660bb8d2e41e57f0e72658349b2a957ad3"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.773926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kmcqt" event={"ID":"dd66af3e-b824-414c-abd8-339c62a9e9e1","Type":"ContainerStarted","Data":"81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.824919 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.848332 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.869660 4797 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:51 crc kubenswrapper[4797]: E0930 18:00:50.870064 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="config-reloader" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870081 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="config-reloader" Sep 30 18:00:51 crc kubenswrapper[4797]: E0930 18:00:50.870115 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="init-config-reloader" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870123 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="init-config-reloader" Sep 30 18:00:51 crc kubenswrapper[4797]: E0930 18:00:50.870137 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="thanos-sidecar" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870144 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="thanos-sidecar" Sep 30 18:00:51 crc kubenswrapper[4797]: E0930 18:00:50.870162 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="prometheus" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870169 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="prometheus" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870360 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="prometheus" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870384 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="thanos-sidecar" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.870400 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" containerName="config-reloader" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.872419 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.875687 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.880946 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.881027 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.881199 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.881920 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-n6tgk" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.888202 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.888894 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.905337 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.990948 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991000 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991023 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991057 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991083 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991159 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991175 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991199 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgls\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991243 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:50.991268 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093676 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093755 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 
18:00:51.093822 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgls\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093879 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093909 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093945 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.093975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc 
kubenswrapper[4797]: I0930 18:00:51.093994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.094027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.094058 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.095696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.098989 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.100473 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.100868 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.102406 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.102450 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dc90504e8fd72bffa4b364aa4d9dd59b6e7cea028ec589d2165a3270de0ac3cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.103244 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.104888 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.106033 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.107899 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.119716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgls\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.121768 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.186794 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.228112 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.346378 4797 scope.go:117] "RemoveContainer" containerID="573b550a3cbf98d75ef2ead90adcddfbd1f7a828fc537cdba9456a9a513ea048" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.415616 4797 scope.go:117] "RemoveContainer" containerID="22131c6a5fa0809a5686cd490b3bff816c4adae27c7c7ba69b0e39a7803fa7d0" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.550490 4797 scope.go:117] "RemoveContainer" containerID="2e37114ce23ab91d58d22daaf5fcdf98ae15ccc588feae39c9852b4c5efa4111" Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.801893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"faa782b29e25a9cc20ac9ace07eec33b246ce6f64dd91838f7c10bf1e5a261f9"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.810663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fqpjz" event={"ID":"f2a0c476-b584-4818-a7c0-a5da97aaf4df","Type":"ContainerStarted","Data":"0934b925c8771a692e7fa91a3c882d8aeeb188e15c2a26ad1b0b2f6da26823d6"} Sep 30 18:00:51 crc kubenswrapper[4797]: I0930 18:00:51.832062 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-fqpjz" podStartSLOduration=2.832045586 podStartE2EDuration="2.832045586s" podCreationTimestamp="2025-09-30 18:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:00:51.830130204 +0000 UTC m=+1102.352629442" watchObservedRunningTime="2025-09-30 18:00:51.832045586 +0000 UTC m=+1102.354544824" Sep 30 18:00:52 crc kubenswrapper[4797]: I0930 18:00:52.008243 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:00:52 crc kubenswrapper[4797]: I0930 
18:00:52.255097 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="433bd6ef-bdef-4c2b-9fb3-019fecae8b40" path="/var/lib/kubelet/pods/433bd6ef-bdef-4c2b-9fb3-019fecae8b40/volumes" Sep 30 18:00:52 crc kubenswrapper[4797]: I0930 18:00:52.848093 4797 generic.go:334] "Generic (PLEG): container finished" podID="f2a0c476-b584-4818-a7c0-a5da97aaf4df" containerID="0934b925c8771a692e7fa91a3c882d8aeeb188e15c2a26ad1b0b2f6da26823d6" exitCode=0 Sep 30 18:00:52 crc kubenswrapper[4797]: I0930 18:00:52.848144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fqpjz" event={"ID":"f2a0c476-b584-4818-a7c0-a5da97aaf4df","Type":"ContainerDied","Data":"0934b925c8771a692e7fa91a3c882d8aeeb188e15c2a26ad1b0b2f6da26823d6"} Sep 30 18:00:53 crc kubenswrapper[4797]: W0930 18:00:53.121569 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572d2f77_3315_4b90_860e_18d1973993ef.slice/crio-10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e WatchSource:0}: Error finding container 10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e: Status 404 returned error can't find the container with id 10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.275549 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.277558 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.338023 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfj86\" (UniqueName: \"kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86\") pod \"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f\" (UID: \"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f\") " Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.338238 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75699\" (UniqueName: \"kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699\") pod \"dd66af3e-b824-414c-abd8-339c62a9e9e1\" (UID: \"dd66af3e-b824-414c-abd8-339c62a9e9e1\") " Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.342621 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699" (OuterVolumeSpecName: "kube-api-access-75699") pod "dd66af3e-b824-414c-abd8-339c62a9e9e1" (UID: "dd66af3e-b824-414c-abd8-339c62a9e9e1"). InnerVolumeSpecName "kube-api-access-75699". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.343845 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86" (OuterVolumeSpecName: "kube-api-access-lfj86") pod "4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" (UID: "4df77de6-491d-4cbb-a8e4-ce74e0e99e9f"). InnerVolumeSpecName "kube-api-access-lfj86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.442416 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75699\" (UniqueName: \"kubernetes.io/projected/dd66af3e-b824-414c-abd8-339c62a9e9e1-kube-api-access-75699\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.442467 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfj86\" (UniqueName: \"kubernetes.io/projected/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f-kube-api-access-lfj86\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.858115 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerStarted","Data":"10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e"} Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.865222 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kmcqt" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.867812 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kmcqt" event={"ID":"dd66af3e-b824-414c-abd8-339c62a9e9e1","Type":"ContainerDied","Data":"81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49"} Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.867860 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c9ca4edb2b416fea418b2de1815abfadc029ccd18662e57b1d0b0328386e49" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.872124 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kfmjd" Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.877507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kfmjd" event={"ID":"4df77de6-491d-4cbb-a8e4-ce74e0e99e9f","Type":"ContainerDied","Data":"59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec"} Sep 30 18:00:53 crc kubenswrapper[4797]: I0930 18:00:53.877566 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59912883e962a33841fa3be134b3433265a741301efac5c7c37c804f4580b7ec" Sep 30 18:00:56 crc kubenswrapper[4797]: I0930 18:00:56.905349 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fqpjz" event={"ID":"f2a0c476-b584-4818-a7c0-a5da97aaf4df","Type":"ContainerDied","Data":"f18341b0bbc374dfc82b2c735ef7c50562f756492be5b4cd517db723eb4e3420"} Sep 30 18:00:56 crc kubenswrapper[4797]: I0930 18:00:56.906206 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f18341b0bbc374dfc82b2c735ef7c50562f756492be5b4cd517db723eb4e3420" Sep 30 18:00:56 crc kubenswrapper[4797]: I0930 18:00:56.926455 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fqpjz" Sep 30 18:00:57 crc kubenswrapper[4797]: I0930 18:00:57.029693 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bchq8\" (UniqueName: \"kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8\") pod \"f2a0c476-b584-4818-a7c0-a5da97aaf4df\" (UID: \"f2a0c476-b584-4818-a7c0-a5da97aaf4df\") " Sep 30 18:00:57 crc kubenswrapper[4797]: I0930 18:00:57.034687 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8" (OuterVolumeSpecName: "kube-api-access-bchq8") pod "f2a0c476-b584-4818-a7c0-a5da97aaf4df" (UID: "f2a0c476-b584-4818-a7c0-a5da97aaf4df"). InnerVolumeSpecName "kube-api-access-bchq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:00:57 crc kubenswrapper[4797]: I0930 18:00:57.132205 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bchq8\" (UniqueName: \"kubernetes.io/projected/f2a0c476-b584-4818-a7c0-a5da97aaf4df-kube-api-access-bchq8\") on node \"crc\" DevicePath \"\"" Sep 30 18:00:57 crc kubenswrapper[4797]: I0930 18:00:57.926773 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fqpjz" Sep 30 18:01:04 crc kubenswrapper[4797]: E0930 18:01:04.469822 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Sep 30 18:01:04 crc kubenswrapper[4797]: E0930 18:01:04.470418 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bcww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:
nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-jqh9l_openstack(f34d2759-ae6e-43a3-8010-80596a570a37): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:01:04 crc kubenswrapper[4797]: E0930 18:01:04.472609 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/keystone-db-sync-jqh9l" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" Sep 30 18:01:04 crc kubenswrapper[4797]: E0930 18:01:04.983339 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="openstack/keystone-db-sync-jqh9l" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.782353 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-de80-account-create-v87xg"] Sep 30 18:01:08 crc kubenswrapper[4797]: E0930 18:01:08.783641 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.783665 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: E0930 18:01:08.783697 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd66af3e-b824-414c-abd8-339c62a9e9e1" containerName="mariadb-database-create" Sep 30 18:01:08 crc 
kubenswrapper[4797]: I0930 18:01:08.783711 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd66af3e-b824-414c-abd8-339c62a9e9e1" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: E0930 18:01:08.783745 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a0c476-b584-4818-a7c0-a5da97aaf4df" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.783759 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a0c476-b584-4818-a7c0-a5da97aaf4df" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.784080 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.784117 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a0c476-b584-4818-a7c0-a5da97aaf4df" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.784145 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd66af3e-b824-414c-abd8-339c62a9e9e1" containerName="mariadb-database-create" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.785235 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.786667 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.791557 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-de80-account-create-v87xg"] Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.866506 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lw7\" (UniqueName: \"kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7\") pod \"cinder-de80-account-create-v87xg\" (UID: \"cc55cc29-cee8-4470-a6f6-398ff183ba0b\") " pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.969537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lw7\" (UniqueName: \"kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7\") pod \"cinder-de80-account-create-v87xg\" (UID: \"cc55cc29-cee8-4470-a6f6-398ff183ba0b\") " pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.980709 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bbfa-account-create-gjc2s"] Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.982016 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.984276 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.989388 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bbfa-account-create-gjc2s"] Sep 30 18:01:08 crc kubenswrapper[4797]: I0930 18:01:08.991201 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lw7\" (UniqueName: \"kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7\") pod \"cinder-de80-account-create-v87xg\" (UID: \"cc55cc29-cee8-4470-a6f6-398ff183ba0b\") " pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.070994 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvlc\" (UniqueName: \"kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc\") pod \"barbican-bbfa-account-create-gjc2s\" (UID: \"499f35f7-af0b-4713-858e-ec4123da64a0\") " pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.110688 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.187838 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glvlc\" (UniqueName: \"kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc\") pod \"barbican-bbfa-account-create-gjc2s\" (UID: \"499f35f7-af0b-4713-858e-ec4123da64a0\") " pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.205458 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-932a-account-create-pq6wq"] Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.206950 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.209341 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.213806 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-932a-account-create-pq6wq"] Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.216378 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glvlc\" (UniqueName: \"kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc\") pod \"barbican-bbfa-account-create-gjc2s\" (UID: \"499f35f7-af0b-4713-858e-ec4123da64a0\") " pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.289312 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69nt\" (UniqueName: \"kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt\") pod \"neutron-932a-account-create-pq6wq\" (UID: \"8f0627c8-4235-4c74-81eb-aef495551b9f\") " 
pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.344055 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.391681 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69nt\" (UniqueName: \"kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt\") pod \"neutron-932a-account-create-pq6wq\" (UID: \"8f0627c8-4235-4c74-81eb-aef495551b9f\") " pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.407851 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69nt\" (UniqueName: \"kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt\") pod \"neutron-932a-account-create-pq6wq\" (UID: \"8f0627c8-4235-4c74-81eb-aef495551b9f\") " pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:09 crc kubenswrapper[4797]: I0930 18:01:09.590188 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:11 crc kubenswrapper[4797]: E0930 18:01:11.464180 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Sep 30 18:01:11 crc kubenswrapper[4797]: E0930 18:01:11.464681 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vpnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privi
leged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-f899g_openstack(fe16203f-60b5-483f-83b5-1d26b25292c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:01:11 crc kubenswrapper[4797]: E0930 18:01:11.465953 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-f899g" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" Sep 30 18:01:11 crc kubenswrapper[4797]: I0930 18:01:11.997184 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-de80-account-create-v87xg"] Sep 30 18:01:12 crc kubenswrapper[4797]: W0930 18:01:12.004831 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc55cc29_cee8_4470_a6f6_398ff183ba0b.slice/crio-d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f WatchSource:0}: Error finding container d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f: Status 404 returned error can't find the container with id d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f Sep 30 18:01:12 crc kubenswrapper[4797]: W0930 18:01:12.009419 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0627c8_4235_4c74_81eb_aef495551b9f.slice/crio-28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282 WatchSource:0}: Error finding container 28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282: Status 404 returned error can't find the container with id 28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282 Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.010154 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-932a-account-create-pq6wq"] Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.085464 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-g9tqz" event={"ID":"d7f8ded7-5fab-43fb-8d0f-f514889b5640","Type":"ContainerStarted","Data":"67a7976a1409b78457acdbd2edd3f78545964324b8aee1c4d4ea1dd1782c8b34"} Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.086748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-932a-account-create-pq6wq" event={"ID":"8f0627c8-4235-4c74-81eb-aef495551b9f","Type":"ContainerStarted","Data":"28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282"} Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.092046 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"611b32dc023f9bf23dba38d437b7f1913a9b13d94c9002606df2c9773a8ac69d"} Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.092092 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"081ed2d206ed91d288940e5b1ccae40e2ab681a633c0fb8a63440d8480f8b230"} Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.092105 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"cdc9b8e981520ea6029bf99f0f97e2f9b708402dce5aebc66a65b62351eae5dc"} Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.095411 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de80-account-create-v87xg" event={"ID":"cc55cc29-cee8-4470-a6f6-398ff183ba0b","Type":"ContainerStarted","Data":"d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f"} Sep 30 18:01:12 crc kubenswrapper[4797]: E0930 18:01:12.095943 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-f899g" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.111467 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-g9tqz" podStartSLOduration=2.566085782 podStartE2EDuration="24.111453805s" podCreationTimestamp="2025-09-30 18:00:48 +0000 UTC" firstStartedPulling="2025-09-30 18:00:49.97431309 +0000 UTC m=+1100.496812328" lastFinishedPulling="2025-09-30 18:01:11.519681083 +0000 UTC m=+1122.042180351" observedRunningTime="2025-09-30 18:01:12.109356008 +0000 UTC m=+1122.631855246" watchObservedRunningTime="2025-09-30 18:01:12.111453805 +0000 UTC m=+1122.633953043" Sep 30 18:01:12 crc kubenswrapper[4797]: I0930 18:01:12.146940 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bbfa-account-create-gjc2s"] Sep 30 18:01:12 crc kubenswrapper[4797]: W0930 18:01:12.158775 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod499f35f7_af0b_4713_858e_ec4123da64a0.slice/crio-5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656 WatchSource:0}: Error finding 
container 5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656: Status 404 returned error can't find the container with id 5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656 Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.131956 4797 generic.go:334] "Generic (PLEG): container finished" podID="8f0627c8-4235-4c74-81eb-aef495551b9f" containerID="00b9cb38b97622f63cdb7a5d519afd3f6fe820c91a704879d41a4abb194269e0" exitCode=0 Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.132031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-932a-account-create-pq6wq" event={"ID":"8f0627c8-4235-4c74-81eb-aef495551b9f","Type":"ContainerDied","Data":"00b9cb38b97622f63cdb7a5d519afd3f6fe820c91a704879d41a4abb194269e0"} Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.133859 4797 generic.go:334] "Generic (PLEG): container finished" podID="cc55cc29-cee8-4470-a6f6-398ff183ba0b" containerID="902a7a18c877f848c42dc317847cb1ac3a746878bf1d1544040483c12ba8a9c1" exitCode=0 Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.134019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de80-account-create-v87xg" event={"ID":"cc55cc29-cee8-4470-a6f6-398ff183ba0b","Type":"ContainerDied","Data":"902a7a18c877f848c42dc317847cb1ac3a746878bf1d1544040483c12ba8a9c1"} Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.136055 4797 generic.go:334] "Generic (PLEG): container finished" podID="499f35f7-af0b-4713-858e-ec4123da64a0" containerID="ca1dab9996455a1209064e4c2671e4c855cd56451110a56a64ad5befbc571c59" exitCode=0 Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.136093 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bbfa-account-create-gjc2s" event={"ID":"499f35f7-af0b-4713-858e-ec4123da64a0","Type":"ContainerDied","Data":"ca1dab9996455a1209064e4c2671e4c855cd56451110a56a64ad5befbc571c59"} Sep 30 18:01:13 crc kubenswrapper[4797]: I0930 18:01:13.136111 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bbfa-account-create-gjc2s" event={"ID":"499f35f7-af0b-4713-858e-ec4123da64a0","Type":"ContainerStarted","Data":"5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656"} Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.146603 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"f92516d745408e1870be9a6406044abfa81eddb3d1e9b9a41709a6874508580a"} Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.146903 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"f9b3b5f857090e9caa08a515d259429c2de0a4f57b9c9b537cdc50f689e1d1ea"} Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.146914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"cd48541a28d357c5113516c90e16ff1a3c39bef7907faa32bee9e5dcaf9dd3aa"} Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.202512 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.202570 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.478562 4797 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.509766 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glvlc\" (UniqueName: \"kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc\") pod \"499f35f7-af0b-4713-858e-ec4123da64a0\" (UID: \"499f35f7-af0b-4713-858e-ec4123da64a0\") " Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.525376 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc" (OuterVolumeSpecName: "kube-api-access-glvlc") pod "499f35f7-af0b-4713-858e-ec4123da64a0" (UID: "499f35f7-af0b-4713-858e-ec4123da64a0"). InnerVolumeSpecName "kube-api-access-glvlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.612141 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glvlc\" (UniqueName: \"kubernetes.io/projected/499f35f7-af0b-4713-858e-ec4123da64a0-kube-api-access-glvlc\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.679098 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.688359 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.713259 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z69nt\" (UniqueName: \"kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt\") pod \"8f0627c8-4235-4c74-81eb-aef495551b9f\" (UID: \"8f0627c8-4235-4c74-81eb-aef495551b9f\") " Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.713376 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lw7\" (UniqueName: \"kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7\") pod \"cc55cc29-cee8-4470-a6f6-398ff183ba0b\" (UID: \"cc55cc29-cee8-4470-a6f6-398ff183ba0b\") " Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.719003 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7" (OuterVolumeSpecName: "kube-api-access-q6lw7") pod "cc55cc29-cee8-4470-a6f6-398ff183ba0b" (UID: "cc55cc29-cee8-4470-a6f6-398ff183ba0b"). InnerVolumeSpecName "kube-api-access-q6lw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.736021 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt" (OuterVolumeSpecName: "kube-api-access-z69nt") pod "8f0627c8-4235-4c74-81eb-aef495551b9f" (UID: "8f0627c8-4235-4c74-81eb-aef495551b9f"). InnerVolumeSpecName "kube-api-access-z69nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.816092 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z69nt\" (UniqueName: \"kubernetes.io/projected/8f0627c8-4235-4c74-81eb-aef495551b9f-kube-api-access-z69nt\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:14 crc kubenswrapper[4797]: I0930 18:01:14.816126 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lw7\" (UniqueName: \"kubernetes.io/projected/cc55cc29-cee8-4470-a6f6-398ff183ba0b-kube-api-access-q6lw7\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.171069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-932a-account-create-pq6wq" event={"ID":"8f0627c8-4235-4c74-81eb-aef495551b9f","Type":"ContainerDied","Data":"28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282"} Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.171147 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c58aa10a71e4796152e6c099f946af99da128b0117b316a536cfa5c2b3c282" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.171089 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-932a-account-create-pq6wq" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.173604 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de80-account-create-v87xg" event={"ID":"cc55cc29-cee8-4470-a6f6-398ff183ba0b","Type":"ContainerDied","Data":"d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f"} Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.173642 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de80-account-create-v87xg" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.173660 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35329b89d79a80eed6ff3d7b05c974cc4c8c57b6604e0d1524c4e6c3bd5676f" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.176157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerStarted","Data":"5e3044933f061d0aba07cb9247c7818ce313a8ac44f853b909a494d2eb49c327"} Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.185289 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"10d7486a8f69c6f5dbc35d2d4ac599a75dd1bbdb69040878bf96d2042d93cbb3"} Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.187848 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bbfa-account-create-gjc2s" event={"ID":"499f35f7-af0b-4713-858e-ec4123da64a0","Type":"ContainerDied","Data":"5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656"} Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.187906 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0c8460f05605eb9c9b5959ff6e5d4bf1870028ae1763fb7a11b1e5e9d58656" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.187961 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bbfa-account-create-gjc2s" Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.190873 4797 generic.go:334] "Generic (PLEG): container finished" podID="d7f8ded7-5fab-43fb-8d0f-f514889b5640" containerID="67a7976a1409b78457acdbd2edd3f78545964324b8aee1c4d4ea1dd1782c8b34" exitCode=0 Sep 30 18:01:15 crc kubenswrapper[4797]: I0930 18:01:15.190914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-g9tqz" event={"ID":"d7f8ded7-5fab-43fb-8d0f-f514889b5640","Type":"ContainerDied","Data":"67a7976a1409b78457acdbd2edd3f78545964324b8aee1c4d4ea1dd1782c8b34"} Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.255858 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"5b3f58c52fd2e0239493a314ef27268fff1ee6b5a76018d174892a30baf549b3"} Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.256211 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"2efbba6d2be2bba78bf6fd3bff20081c97c57d95780c3660e6666ea6d27530ab"} Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.631959 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.771321 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle\") pod \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.771629 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l544t\" (UniqueName: \"kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t\") pod \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.771744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data\") pod \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.771868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data\") pod \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\" (UID: \"d7f8ded7-5fab-43fb-8d0f-f514889b5640\") " Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.776240 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7f8ded7-5fab-43fb-8d0f-f514889b5640" (UID: "d7f8ded7-5fab-43fb-8d0f-f514889b5640"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.785762 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t" (OuterVolumeSpecName: "kube-api-access-l544t") pod "d7f8ded7-5fab-43fb-8d0f-f514889b5640" (UID: "d7f8ded7-5fab-43fb-8d0f-f514889b5640"). InnerVolumeSpecName "kube-api-access-l544t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.814788 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7f8ded7-5fab-43fb-8d0f-f514889b5640" (UID: "d7f8ded7-5fab-43fb-8d0f-f514889b5640"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.851065 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data" (OuterVolumeSpecName: "config-data") pod "d7f8ded7-5fab-43fb-8d0f-f514889b5640" (UID: "d7f8ded7-5fab-43fb-8d0f-f514889b5640"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.873426 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.873806 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.873825 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l544t\" (UniqueName: \"kubernetes.io/projected/d7f8ded7-5fab-43fb-8d0f-f514889b5640-kube-api-access-l544t\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:16 crc kubenswrapper[4797]: I0930 18:01:16.873839 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7f8ded7-5fab-43fb-8d0f-f514889b5640-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.289120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"8e6be2cbc9655c2eee6444a7b24e322a4fb01cc9c90719e5582efbfff9f68b74"} Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.289163 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"4f2caeafb0d43f7064804421fb195eb8219815d143cb19dbf3ac4bce37673a3d"} Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.289174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"c29c4a54aaa7cfd392f2a8347352ec0b62db1c55745082119fff0c065cc95c12"} Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.289183 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"733bb9e30d2b491b612ca4c04fd2b78ccd00c5d782a682189b2dea0ff59f61b3"} Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.291063 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-g9tqz" event={"ID":"d7f8ded7-5fab-43fb-8d0f-f514889b5640","Type":"ContainerDied","Data":"f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8"} Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.291099 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5cff9c0a690f1fd3c6aacc68cde3c7368f62db061ec7166f2ab8c8d8f4710f8" Sep 30 18:01:17 crc kubenswrapper[4797]: I0930 18:01:17.291117 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-g9tqz" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.307154 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fecceec-298d-4979-b468-5fe35c9b68e7","Type":"ContainerStarted","Data":"3d4d4fc01efab5e704363aa3ba19e511dde1d4fb07ee0328e3d8790f07f4a1a4"} Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.347055 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.855533242 podStartE2EDuration="1m4.347038246s" podCreationTimestamp="2025-09-30 18:00:14 +0000 UTC" firstStartedPulling="2025-09-30 18:00:48.102909289 +0000 UTC m=+1098.625408527" lastFinishedPulling="2025-09-30 18:01:15.594414293 +0000 UTC m=+1126.116913531" observedRunningTime="2025-09-30 18:01:18.338390511 +0000 UTC m=+1128.860889779" watchObservedRunningTime="2025-09-30 18:01:18.347038246 +0000 UTC m=+1128.869537484" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.605274 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:18 crc kubenswrapper[4797]: E0930 18:01:18.605609 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0627c8-4235-4c74-81eb-aef495551b9f" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.605626 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0627c8-4235-4c74-81eb-aef495551b9f" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: E0930 18:01:18.605648 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499f35f7-af0b-4713-858e-ec4123da64a0" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.605655 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="499f35f7-af0b-4713-858e-ec4123da64a0" containerName="mariadb-account-create" Sep 30 18:01:18 crc 
kubenswrapper[4797]: E0930 18:01:18.605683 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f8ded7-5fab-43fb-8d0f-f514889b5640" containerName="watcher-db-sync" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.605689 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f8ded7-5fab-43fb-8d0f-f514889b5640" containerName="watcher-db-sync" Sep 30 18:01:18 crc kubenswrapper[4797]: E0930 18:01:18.605701 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc55cc29-cee8-4470-a6f6-398ff183ba0b" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.605707 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc55cc29-cee8-4470-a6f6-398ff183ba0b" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.606047 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f8ded7-5fab-43fb-8d0f-f514889b5640" containerName="watcher-db-sync" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.606066 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0627c8-4235-4c74-81eb-aef495551b9f" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.606075 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc55cc29-cee8-4470-a6f6-398ff183ba0b" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.606088 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="499f35f7-af0b-4713-858e-ec4123da64a0" containerName="mariadb-account-create" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.606927 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.609205 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.628195 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.706698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.706735 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.706772 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.706942 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: 
\"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.707142 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.707165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809298 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: 
\"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809329 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809403 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.809448 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.810285 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.810349 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" 
Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.810378 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.810600 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.810752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.842048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826\") pod \"dnsmasq-dns-77585f5f8c-zs7mh\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:18 crc kubenswrapper[4797]: I0930 18:01:18.938242 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:19 crc kubenswrapper[4797]: I0930 18:01:19.435933 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:20 crc kubenswrapper[4797]: I0930 18:01:20.330526 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqh9l" event={"ID":"f34d2759-ae6e-43a3-8010-80596a570a37","Type":"ContainerStarted","Data":"ff921115c86c25979c1c63c0a2a2e99fbc994c692c94ee22efb61b0e7107cbcf"} Sep 30 18:01:20 crc kubenswrapper[4797]: I0930 18:01:20.341858 4797 generic.go:334] "Generic (PLEG): container finished" podID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerID="c0636382d894a9f2e7bcd36588dad82f62e56c92c0f862d34bfec80b7f993b92" exitCode=0 Sep 30 18:01:20 crc kubenswrapper[4797]: I0930 18:01:20.341915 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" event={"ID":"e9a35da8-9543-41cd-a16d-d2b16c3416f3","Type":"ContainerDied","Data":"c0636382d894a9f2e7bcd36588dad82f62e56c92c0f862d34bfec80b7f993b92"} Sep 30 18:01:20 crc kubenswrapper[4797]: I0930 18:01:20.341948 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" event={"ID":"e9a35da8-9543-41cd-a16d-d2b16c3416f3","Type":"ContainerStarted","Data":"29d64dd5a7a27b3cdab76a3d43569b05e8023d9699c33d574d126fda4368924a"} Sep 30 18:01:20 crc kubenswrapper[4797]: I0930 18:01:20.360166 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jqh9l" podStartSLOduration=1.936956194 podStartE2EDuration="31.360152234s" podCreationTimestamp="2025-09-30 18:00:49 +0000 UTC" firstStartedPulling="2025-09-30 18:00:50.336274139 +0000 UTC m=+1100.858773377" lastFinishedPulling="2025-09-30 18:01:19.759470189 +0000 UTC m=+1130.281969417" observedRunningTime="2025-09-30 18:01:20.359451375 +0000 UTC m=+1130.881950613" watchObservedRunningTime="2025-09-30 
18:01:20.360152234 +0000 UTC m=+1130.882651462" Sep 30 18:01:21 crc kubenswrapper[4797]: I0930 18:01:21.351664 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" event={"ID":"e9a35da8-9543-41cd-a16d-d2b16c3416f3","Type":"ContainerStarted","Data":"e9ec585bfa02717b07e0456c7e87adec9af5d2521f2d2c31dd8a9648840d07fc"} Sep 30 18:01:21 crc kubenswrapper[4797]: I0930 18:01:21.351986 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:21 crc kubenswrapper[4797]: I0930 18:01:21.383835 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" podStartSLOduration=3.383817496 podStartE2EDuration="3.383817496s" podCreationTimestamp="2025-09-30 18:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:21.375877659 +0000 UTC m=+1131.898376957" watchObservedRunningTime="2025-09-30 18:01:21.383817496 +0000 UTC m=+1131.906316734" Sep 30 18:01:22 crc kubenswrapper[4797]: I0930 18:01:22.361282 4797 generic.go:334] "Generic (PLEG): container finished" podID="572d2f77-3315-4b90-860e-18d1973993ef" containerID="5e3044933f061d0aba07cb9247c7818ce313a8ac44f853b909a494d2eb49c327" exitCode=0 Sep 30 18:01:22 crc kubenswrapper[4797]: I0930 18:01:22.361371 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerDied","Data":"5e3044933f061d0aba07cb9247c7818ce313a8ac44f853b909a494d2eb49c327"} Sep 30 18:01:23 crc kubenswrapper[4797]: I0930 18:01:23.373166 4797 generic.go:334] "Generic (PLEG): container finished" podID="f34d2759-ae6e-43a3-8010-80596a570a37" containerID="ff921115c86c25979c1c63c0a2a2e99fbc994c692c94ee22efb61b0e7107cbcf" exitCode=0 Sep 30 18:01:23 crc kubenswrapper[4797]: I0930 18:01:23.373244 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqh9l" event={"ID":"f34d2759-ae6e-43a3-8010-80596a570a37","Type":"ContainerDied","Data":"ff921115c86c25979c1c63c0a2a2e99fbc994c692c94ee22efb61b0e7107cbcf"} Sep 30 18:01:23 crc kubenswrapper[4797]: I0930 18:01:23.376023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerStarted","Data":"98b0a5b108c6563502107929a2d92851be5ca924b5ec9a9eb068abf5b67c0a0f"} Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.770937 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.827193 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bcww\" (UniqueName: \"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww\") pod \"f34d2759-ae6e-43a3-8010-80596a570a37\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.827566 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data\") pod \"f34d2759-ae6e-43a3-8010-80596a570a37\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.827606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle\") pod \"f34d2759-ae6e-43a3-8010-80596a570a37\" (UID: \"f34d2759-ae6e-43a3-8010-80596a570a37\") " Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.838488 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww" (OuterVolumeSpecName: "kube-api-access-5bcww") pod "f34d2759-ae6e-43a3-8010-80596a570a37" (UID: "f34d2759-ae6e-43a3-8010-80596a570a37"). InnerVolumeSpecName "kube-api-access-5bcww". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.857845 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f34d2759-ae6e-43a3-8010-80596a570a37" (UID: "f34d2759-ae6e-43a3-8010-80596a570a37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.878972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data" (OuterVolumeSpecName: "config-data") pod "f34d2759-ae6e-43a3-8010-80596a570a37" (UID: "f34d2759-ae6e-43a3-8010-80596a570a37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.929630 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bcww\" (UniqueName: \"kubernetes.io/projected/f34d2759-ae6e-43a3-8010-80596a570a37-kube-api-access-5bcww\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.929665 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:24 crc kubenswrapper[4797]: I0930 18:01:24.929674 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d2759-ae6e-43a3-8010-80596a570a37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.403183 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqh9l" event={"ID":"f34d2759-ae6e-43a3-8010-80596a570a37","Type":"ContainerDied","Data":"751411a8428c83d8ccba3c558a352b322228318778c9641738562b5ca637227d"} Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.403245 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751411a8428c83d8ccba3c558a352b322228318778c9641738562b5ca637227d" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.403330 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jqh9l" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.405393 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f899g" event={"ID":"fe16203f-60b5-483f-83b5-1d26b25292c9","Type":"ContainerStarted","Data":"ee4184338f4202c69830b3b41d086b57863af8f284f4e6c7d49eac529777f16a"} Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.430176 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-f899g" podStartSLOduration=3.709885517 podStartE2EDuration="38.430161111s" podCreationTimestamp="2025-09-30 18:00:47 +0000 UTC" firstStartedPulling="2025-09-30 18:00:48.982899959 +0000 UTC m=+1099.505399197" lastFinishedPulling="2025-09-30 18:01:23.703175553 +0000 UTC m=+1134.225674791" observedRunningTime="2025-09-30 18:01:25.422743899 +0000 UTC m=+1135.945243137" watchObservedRunningTime="2025-09-30 18:01:25.430161111 +0000 UTC m=+1135.952660349" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.709428 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4j78k"] Sep 30 18:01:25 crc kubenswrapper[4797]: E0930 18:01:25.709917 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" containerName="keystone-db-sync" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.709940 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" containerName="keystone-db-sync" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.710153 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" containerName="keystone-db-sync" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.710756 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.716655 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.716917 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.717015 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8tsjk" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.719179 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.737268 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4j78k"] Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743314 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743425 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78n25\" (UniqueName: \"kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743492 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys\") pod \"keystone-bootstrap-4j78k\" 
(UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743574 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743645 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.743683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.775538 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.776149 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="dnsmasq-dns" containerID="cri-o://e9ec585bfa02717b07e0456c7e87adec9af5d2521f2d2c31dd8a9648840d07fc" gracePeriod=10 Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.781703 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:25 crc 
kubenswrapper[4797]: I0930 18:01:25.854699 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.854770 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.854797 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.854818 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.854858 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78n25\" (UniqueName: \"kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.854887 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.908361 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.909084 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.914813 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.915088 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.918664 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys\") pod \"keystone-bootstrap-4j78k\" 
(UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.926345 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.931815 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.964970 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.965022 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.965058 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66k5\" (UniqueName: \"kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.965082 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: 
\"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.965164 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.965215 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:25 crc kubenswrapper[4797]: I0930 18:01:25.997364 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78n25\" (UniqueName: \"kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25\") pod \"keystone-bootstrap-4j78k\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.035550 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.058500 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.059674 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.073550 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.074592 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.076026 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66k5\" (UniqueName: \"kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.076082 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.076264 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.076369 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: 
I0930 18:01:26.076547 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.076577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.077767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.078329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.079109 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.080717 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.080934 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.084779 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.100540 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.100776 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.100883 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-f2w2l" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.100966 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.105141 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.105139 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.120271 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.139169 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.159034 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66k5\" (UniqueName: \"kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5\") pod \"dnsmasq-dns-55fff446b9-g2g4r\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.165417 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182544 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182637 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpc5r\" (UniqueName: \"kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182798 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182827 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " 
pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182898 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182935 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mq8\" (UniqueName: \"kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182958 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.182984 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.183004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc 
kubenswrapper[4797]: I0930 18:01:26.183065 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.183127 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2sgf\" (UniqueName: \"kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.183171 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.183203 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.198568 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.223994 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.225690 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.237837 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.238017 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.238152 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.238256 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ch4w5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.284868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.284920 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.284994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mq8\" 
(UniqueName: \"kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285057 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285105 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285139 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data\") pod 
\"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285157 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2sgf\" (UniqueName: \"kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285177 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285195 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285218 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9qr\" (UniqueName: \"kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " 
pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285282 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285302 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285318 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285337 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.285358 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpc5r\" (UniqueName: \"kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.294241 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.299302 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.302242 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.317696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.326174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.326617 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " 
pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.333696 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.333773 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.338423 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.348786 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.375062 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.382684 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: 
\"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.386163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.386241 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9qr\" (UniqueName: \"kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.386286 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.386304 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.386335 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc 
kubenswrapper[4797]: I0930 18:01:26.388099 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.419645 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.444506 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.450854 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mq8\" (UniqueName: \"kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8\") pod \"watcher-api-0\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.452832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.453693 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2sgf\" (UniqueName: 
\"kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf\") pod \"watcher-applier-0\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") " pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.477022 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9qr\" (UniqueName: \"kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr\") pod \"horizon-758c546b97-2hd7n\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.483666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpc5r\" (UniqueName: \"kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r\") pod \"watcher-decision-engine-0\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.505021 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2zrb5"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.506248 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.509690 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.509738 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pwstb" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.510511 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.512029 4797 generic.go:334] "Generic (PLEG): container finished" podID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerID="e9ec585bfa02717b07e0456c7e87adec9af5d2521f2d2c31dd8a9648840d07fc" exitCode=0 Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.512082 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" event={"ID":"e9a35da8-9543-41cd-a16d-d2b16c3416f3","Type":"ContainerDied","Data":"e9ec585bfa02717b07e0456c7e87adec9af5d2521f2d2c31dd8a9648840d07fc"} Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.547041 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.550054 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerStarted","Data":"7fdaa5a582c2a250506cca3e78e660c432758d28c95f692db0c6bbbcc9027b68"} Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.559495 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vlvs2"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.560621 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.568055 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.568228 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.568323 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vrxmm" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.576262 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2zrb5"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.593540 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sm2p\" (UniqueName: \"kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.593605 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.593646 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.601072 4797 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.622326 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vlvs2"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.671901 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.672763 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.674066 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.689735 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695591 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695644 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695667 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sm2p\" (UniqueName: \"kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p\") pod \"neutron-db-sync-vlvs2\" (UID: 
\"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6sbm\" (UniqueName: \"kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695779 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695860 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695878 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695907 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " 
pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.695937 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.700646 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.703421 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.712159 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2q6s8"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.713659 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.720630 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.720780 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tpqh2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.720805 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.720977 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.721247 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.746793 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sm2p\" (UniqueName: \"kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p\") pod \"neutron-db-sync-vlvs2\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.757822 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2q6s8"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.763340 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.764820 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.788538 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrkp\" (UniqueName: \"kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797201 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62tcc\" (UniqueName: \"kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797409 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797504 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797610 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797660 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797750 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key\") pod 
\"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797777 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797809 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6sbm\" (UniqueName: \"kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.797869 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.798034 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.798103 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " 
pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.798176 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.798241 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.808022 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hvw66"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.809862 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.812179 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.844235 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.853294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6sbm\" (UniqueName: \"kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.853631 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.854605 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data\") pod \"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.855063 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6k5z6" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.892947 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.894009 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912022 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912118 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912153 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912233 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data\") pod \"horizon-549f8c559c-fcmkl\" (UID: 
\"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912287 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912421 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912478 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrkp\" (UniqueName: \"kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912507 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912541 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62tcc\" (UniqueName: \"kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 
18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912657 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.912928 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.915056 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.918174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.918254 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.923182 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.926130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts\") pod 
\"cinder-db-sync-2zrb5\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") " pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.928832 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.928935 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.929780 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.931390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.942512 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.962716 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=36.962697292 podStartE2EDuration="36.962697292s" 
podCreationTimestamp="2025-09-30 18:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:26.667190926 +0000 UTC m=+1137.189690164" watchObservedRunningTime="2025-09-30 18:01:26.962697292 +0000 UTC m=+1137.485196530" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.972212 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62tcc\" (UniqueName: \"kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc\") pod \"horizon-549f8c559c-fcmkl\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.986470 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrkp\" (UniqueName: \"kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp\") pod \"placement-db-sync-2q6s8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:26 crc kubenswrapper[4797]: I0930 18:01:26.989485 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hvw66"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.008795 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.038997 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039037 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbpnz\" 
(UniqueName: \"kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039059 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49crk\" (UniqueName: \"kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039119 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039149 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039177 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039223 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.039385 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.062010 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.076459 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2q6s8" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.145887 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.145999 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.146077 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.146103 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.146121 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.146237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc\") pod \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\" (UID: \"e9a35da8-9543-41cd-a16d-d2b16c3416f3\") " Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148199 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148578 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148667 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148853 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lsr\" (UniqueName: \"kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr\") pod \"ceilometer-0\" (UID: 
\"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148903 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbpnz\" (UniqueName: \"kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.148946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152419 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49crk\" (UniqueName: \"kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152495 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " 
pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152544 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152647 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152710 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.152737 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc 
kubenswrapper[4797]: I0930 18:01:27.152802 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.153907 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.154109 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.154314 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.154855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.157116 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2zrb5" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.162252 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826" (OuterVolumeSpecName: "kube-api-access-6f826") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "kube-api-access-6f826". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.168268 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.172358 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.172959 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.192002 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49crk\" (UniqueName: \"kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk\") pod \"dnsmasq-dns-76fcf4b695-g7g5h\" (UID: 
\"ae7c614c-eec0-400f-8862-4a19e74046da\") " pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.193060 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbpnz\" (UniqueName: \"kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz\") pod \"barbican-db-sync-hvw66\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.226379 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.226576 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.252584 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.253411 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.253697 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config" (OuterVolumeSpecName: "config") pod "e9a35da8-9543-41cd-a16d-d2b16c3416f3" (UID: "e9a35da8-9543-41cd-a16d-d2b16c3416f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254551 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254600 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 
18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254628 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254693 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.254762 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4lsr\" (UniqueName: \"kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256173 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256199 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256219 4797 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256230 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256244 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/e9a35da8-9543-41cd-a16d-d2b16c3416f3-kube-api-access-6f826\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.256258 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9a35da8-9543-41cd-a16d-d2b16c3416f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.261036 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.261666 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.261888 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd\") pod \"ceilometer-0\" (UID: 
\"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.264824 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.266952 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.290541 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.303106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4lsr\" (UniqueName: \"kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr\") pod \"ceilometer-0\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.381091 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hvw66" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.405148 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.468928 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.499856 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4j78k"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.575072 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.580714 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerStarted","Data":"0e4af960f0a26fef96c5fd646c9bd99282c8a90827127f63d26c69c2bcc27795"} Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.584754 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.592228 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4j78k" event={"ID":"cb048627-f6eb-4487-8ba3-401d75c87743","Type":"ContainerStarted","Data":"d99ee6bd3c2a7d06201a647654ca94e2fe2fbb06ca16e5ccd80270950b81a0ed"} Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.596281 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" event={"ID":"e9a35da8-9543-41cd-a16d-d2b16c3416f3","Type":"ContainerDied","Data":"29d64dd5a7a27b3cdab76a3d43569b05e8023d9699c33d574d126fda4368924a"} Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.596366 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-zs7mh" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.597618 4797 scope.go:117] "RemoveContainer" containerID="e9ec585bfa02717b07e0456c7e87adec9af5d2521f2d2c31dd8a9648840d07fc" Sep 30 18:01:27 crc kubenswrapper[4797]: W0930 18:01:27.624389 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf798459_89f2_474d_9082_eee9e1712e86.slice/crio-b3d0eab54a57c1274ae921c03b8cf7e47400a2cc6a729299a747dfcdba267d9b WatchSource:0}: Error finding container b3d0eab54a57c1274ae921c03b8cf7e47400a2cc6a729299a747dfcdba267d9b: Status 404 returned error can't find the container with id b3d0eab54a57c1274ae921c03b8cf7e47400a2cc6a729299a747dfcdba267d9b Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.645918 4797 scope.go:117] "RemoveContainer" containerID="c0636382d894a9f2e7bcd36588dad82f62e56c92c0f862d34bfec80b7f993b92" Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.736350 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.784596 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-zs7mh"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.884703 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.944318 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:01:27 crc kubenswrapper[4797]: I0930 18:01:27.964458 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.302357 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" 
path="/var/lib/kubelet/pods/e9a35da8-9543-41cd-a16d-d2b16c3416f3/volumes" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.314255 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2zrb5"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.335748 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.437495 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2q6s8"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.489653 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vlvs2"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.517998 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.534336 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hvw66"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.559596 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.625481 4797 generic.go:334] "Generic (PLEG): container finished" podID="698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" containerID="02fa84bb3917e4cce949e91a0f5293ea494eff222c73160b784c2905156ee0f7" exitCode=0 Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.625663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" event={"ID":"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0","Type":"ContainerDied","Data":"02fa84bb3917e4cce949e91a0f5293ea494eff222c73160b784c2905156ee0f7"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.625699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" 
event={"ID":"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0","Type":"ContainerStarted","Data":"e24a47615c7360c3ce9f054430ed2e657bf939f351a467bbe13b1fd66bdb8db0"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.663669 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-549f8c559c-fcmkl" event={"ID":"5e952a92-5166-407d-814d-10160a0eb033","Type":"ContainerStarted","Data":"81cf83307edf8bcbf82c5dca853613e608c668cba15319a5436c543d8c0b0d6c"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.674052 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4j78k" event={"ID":"cb048627-f6eb-4487-8ba3-401d75c87743","Type":"ContainerStarted","Data":"ea62b865d31ec338be9c647e612fd8d84155210dcdd889557093246b4daca098"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.679420 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerStarted","Data":"c6e437669840d6e719bf83c5a82adb532f146d92a20eb2743750bcad96694093"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.682585 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af798459-89f2-474d-9082-eee9e1712e86","Type":"ContainerStarted","Data":"b3d0eab54a57c1274ae921c03b8cf7e47400a2cc6a729299a747dfcdba267d9b"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.684389 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hvw66" event={"ID":"81ba253b-ce64-4926-a8d3-1c8dd9dfef16","Type":"ContainerStarted","Data":"adf512f3ac3be8dfb0649eeee84fc8479f6cd8039e16cd4298cdd41625d36576"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.686731 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2q6s8" 
event={"ID":"e7b2a372-883f-4418-9939-6de336219cb8","Type":"ContainerStarted","Data":"9be1ad00f28c67a4eeedb62ac0aa21f082d8ff400fdd6846a488a2795ca7913d"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.693247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758c546b97-2hd7n" event={"ID":"4a08b307-3f62-4868-b20c-f43e337c5481","Type":"ContainerStarted","Data":"787d4fc75ba033f5ec1eb11455603c645f4cad563e496609839bac77254da23d"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.695050 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlvs2" event={"ID":"2deac399-0305-404a-bf66-cc9d4e122a3a","Type":"ContainerStarted","Data":"2b1502cc709162c05e33c02d20ac8403820616d4fd26088af13d832a343b58c9"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.696101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zrb5" event={"ID":"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844","Type":"ContainerStarted","Data":"72adee500a99840328dcb4d96d6fac93d3c5a92e61c8929bcaf22da7b9a39983"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.715475 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerStarted","Data":"2c1b281620c6f953d650ed9a9d2e4f82b0bd604d75e8f72ebde3cac8949042c6"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.715515 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerStarted","Data":"d5d2b06081f5f012737d9f106b3aee5e761c4aaf5b5b6c50ad211a410c6c9fe4"} Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.725844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a","Type":"ContainerStarted","Data":"a211729f725d3d1b7a906984f85fee82253177dcffdd1ef20ebc92e9b750c786"} Sep 30 18:01:28 
crc kubenswrapper[4797]: I0930 18:01:28.862673 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4j78k" podStartSLOduration=3.862652602 podStartE2EDuration="3.862652602s" podCreationTimestamp="2025-09-30 18:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:28.703087246 +0000 UTC m=+1139.225586484" watchObservedRunningTime="2025-09-30 18:01:28.862652602 +0000 UTC m=+1139.385151840" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.876492 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.919327 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.949650 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:01:28 crc kubenswrapper[4797]: E0930 18:01:28.950124 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="init" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.950147 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="init" Sep 30 18:01:28 crc kubenswrapper[4797]: E0930 18:01:28.950189 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="dnsmasq-dns" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.950198 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="dnsmasq-dns" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.950410 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a35da8-9543-41cd-a16d-d2b16c3416f3" containerName="dnsmasq-dns" Sep 30 18:01:28 crc 
kubenswrapper[4797]: I0930 18:01:28.951623 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:28 crc kubenswrapper[4797]: I0930 18:01:28.966834 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.023874 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.030601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.030795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.030874 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.030958 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " 
pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.031091 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5sm\" (UniqueName: \"kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.135474 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.135653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.136664 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.136775 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 
18:01:29.136900 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5sm\" (UniqueName: \"kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.139906 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.140190 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.150829 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.153106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.158068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5sm\" (UniqueName: 
\"kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm\") pod \"horizon-644644d7d5-bvd5r\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.377426 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.386912 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443094 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66k5\" (UniqueName: \"kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443203 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443287 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443382 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 
18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443409 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.443461 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb\") pod \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\" (UID: \"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0\") " Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.450792 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5" (OuterVolumeSpecName: "kube-api-access-x66k5") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "kube-api-access-x66k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.468148 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config" (OuterVolumeSpecName: "config") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.469153 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.475382 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.494975 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.498548 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" (UID: "698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547066 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547112 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547125 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66k5\" (UniqueName: \"kubernetes.io/projected/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-kube-api-access-x66k5\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547142 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547503 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.547526 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.767046 4797 generic.go:334] "Generic (PLEG): container finished" podID="ae7c614c-eec0-400f-8862-4a19e74046da" containerID="e68a2e1054c5bf8d7d0a2bff511150b45276b6cf2e251155dffb6ca5004c9594" exitCode=0 Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.767331 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" event={"ID":"ae7c614c-eec0-400f-8862-4a19e74046da","Type":"ContainerDied","Data":"e68a2e1054c5bf8d7d0a2bff511150b45276b6cf2e251155dffb6ca5004c9594"} Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.767356 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" event={"ID":"ae7c614c-eec0-400f-8862-4a19e74046da","Type":"ContainerStarted","Data":"52734fa51073cad7cb3af688466539291f963fc561eadb5a20aa2204c2ff635e"} Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.771093 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" event={"ID":"698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0","Type":"ContainerDied","Data":"e24a47615c7360c3ce9f054430ed2e657bf939f351a467bbe13b1fd66bdb8db0"} Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.771127 4797 scope.go:117] "RemoveContainer" containerID="02fa84bb3917e4cce949e91a0f5293ea494eff222c73160b784c2905156ee0f7" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.771208 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g2g4r" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.782157 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlvs2" event={"ID":"2deac399-0305-404a-bf66-cc9d4e122a3a","Type":"ContainerStarted","Data":"3af9863f921ac19ad985e6355e638a2adad2f8f630db7b65158976422722ce09"} Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.821514 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api-log" containerID="cri-o://2c1b281620c6f953d650ed9a9d2e4f82b0bd604d75e8f72ebde3cac8949042c6" gracePeriod=30 Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.821798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerStarted","Data":"1a4df30479adbfbac9241512851b9d7d3e8e0b517821f5d148d329a156313f8f"} Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.821839 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.821912 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" containerID="cri-o://1a4df30479adbfbac9241512851b9d7d3e8e0b517821f5d148d329a156313f8f" gracePeriod=30 Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.834934 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vlvs2" podStartSLOduration=3.83491299 podStartE2EDuration="3.83491299s" podCreationTimestamp="2025-09-30 18:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:29.820967599 +0000 UTC 
m=+1140.343466837" watchObservedRunningTime="2025-09-30 18:01:29.83491299 +0000 UTC m=+1140.357412228" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.840198 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": EOF" Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.874410 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.882873 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g2g4r"] Sep 30 18:01:29 crc kubenswrapper[4797]: I0930 18:01:29.914273 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.914250835 podStartE2EDuration="4.914250835s" podCreationTimestamp="2025-09-30 18:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:29.893678364 +0000 UTC m=+1140.416177602" watchObservedRunningTime="2025-09-30 18:01:29.914250835 +0000 UTC m=+1140.436750073" Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.022322 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.281875 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" path="/var/lib/kubelet/pods/698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0/volumes" Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.838699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" 
event={"ID":"ae7c614c-eec0-400f-8862-4a19e74046da","Type":"ContainerStarted","Data":"1bac15db7d92419dbf76c3bfb7311c2a869872823af2abfda90a9d591c86c9fd"} Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.842383 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.856810 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644644d7d5-bvd5r" event={"ID":"14f99834-e13f-4ddd-a11c-41b7b3907b81","Type":"ContainerStarted","Data":"b44bcb13feb91b76dd1cbc429c867ff0429d18ce6a00f081de4f5afdcfbd1104"} Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.864283 4797 generic.go:334] "Generic (PLEG): container finished" podID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerID="2c1b281620c6f953d650ed9a9d2e4f82b0bd604d75e8f72ebde3cac8949042c6" exitCode=143 Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.864402 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerDied","Data":"2c1b281620c6f953d650ed9a9d2e4f82b0bd604d75e8f72ebde3cac8949042c6"} Sep 30 18:01:30 crc kubenswrapper[4797]: I0930 18:01:30.871527 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" podStartSLOduration=4.871509674 podStartE2EDuration="4.871509674s" podCreationTimestamp="2025-09-30 18:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:30.866390934 +0000 UTC m=+1141.388890172" watchObservedRunningTime="2025-09-30 18:01:30.871509674 +0000 UTC m=+1141.394008922" Sep 30 18:01:31 crc kubenswrapper[4797]: I0930 18:01:31.229092 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 18:01:31 crc kubenswrapper[4797]: I0930 
18:01:31.673326 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.044013 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.074308 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:01:35 crc kubenswrapper[4797]: E0930 18:01:35.074712 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" containerName="init" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.074731 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" containerName="init" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.074887 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="698c7673-5b3a-4ff7-a5d6-2d596d0ec4c0" containerName="init" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.075774 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.078725 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.122884 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170397 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzqx\" (UniqueName: \"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170620 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc 
kubenswrapper[4797]: I0930 18:01:35.170643 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170690 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.170741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.184058 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.210822 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6676d4ddcd-sxf6l"] Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.212949 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.222033 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6676d4ddcd-sxf6l"] Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzqx\" (UniqueName: \"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272475 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272500 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-scripts\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e30fb7-7876-4a90-b887-05b7da2f7746-logs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272545 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-secret-key\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272580 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-config-data\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272630 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272648 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272697 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-combined-ca-bundle\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272749 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-tls-certs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272842 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbhmm\" (UniqueName: \"kubernetes.io/projected/04e30fb7-7876-4a90-b887-05b7da2f7746-kube-api-access-jbhmm\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.272885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.273445 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs\") pod 
\"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.274721 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.275028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.278798 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.286176 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.291962 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc 
kubenswrapper[4797]: I0930 18:01:35.295998 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svzqx\" (UniqueName: \"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx\") pod \"horizon-ff7898f76-hfsxf\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.373939 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-scripts\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.374517 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e30fb7-7876-4a90-b887-05b7da2f7746-logs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.374633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-secret-key\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.374744 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-config-data\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.374866 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-combined-ca-bundle\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.374954 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-tls-certs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.375075 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbhmm\" (UniqueName: \"kubernetes.io/projected/04e30fb7-7876-4a90-b887-05b7da2f7746-kube-api-access-jbhmm\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.375093 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-scripts\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.376106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e30fb7-7876-4a90-b887-05b7da2f7746-logs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.376289 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04e30fb7-7876-4a90-b887-05b7da2f7746-config-data\") pod 
\"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.385775 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-secret-key\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.386348 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-horizon-tls-certs\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.386840 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e30fb7-7876-4a90-b887-05b7da2f7746-combined-ca-bundle\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.394599 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbhmm\" (UniqueName: \"kubernetes.io/projected/04e30fb7-7876-4a90-b887-05b7da2f7746-kube-api-access-jbhmm\") pod \"horizon-6676d4ddcd-sxf6l\" (UID: \"04e30fb7-7876-4a90-b887-05b7da2f7746\") " pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.409047 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:01:35 crc kubenswrapper[4797]: I0930 18:01:35.581363 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:01:36 crc kubenswrapper[4797]: I0930 18:01:36.228985 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 18:01:36 crc kubenswrapper[4797]: I0930 18:01:36.252019 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 18:01:36 crc kubenswrapper[4797]: I0930 18:01:36.317190 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": read tcp 10.217.0.2:36502->10.217.0.151:9322: read: connection reset by peer" Sep 30 18:01:36 crc kubenswrapper[4797]: I0930 18:01:36.673131 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": dial tcp 10.217.0.151:9322: connect: connection refused" Sep 30 18:01:36 crc kubenswrapper[4797]: I0930 18:01:36.927708 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 18:01:37 crc kubenswrapper[4797]: I0930 18:01:37.470562 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:01:37 crc kubenswrapper[4797]: I0930 18:01:37.523881 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:01:37 crc kubenswrapper[4797]: I0930 18:01:37.524104 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" 
containerID="cri-o://e70e34b1e2f2abdb7afeaa78eb3df112af3e91dc117ecf5a8078b286412a83b4" gracePeriod=10 Sep 30 18:01:37 crc kubenswrapper[4797]: E0930 18:01:37.975822 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Sep 30 18:01:38 crc kubenswrapper[4797]: I0930 18:01:38.972633 4797 generic.go:334] "Generic (PLEG): container finished" podID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerID="1a4df30479adbfbac9241512851b9d7d3e8e0b517821f5d148d329a156313f8f" exitCode=0 Sep 30 18:01:38 crc kubenswrapper[4797]: I0930 18:01:38.972707 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerDied","Data":"1a4df30479adbfbac9241512851b9d7d3e8e0b517821f5d148d329a156313f8f"} Sep 30 18:01:38 crc kubenswrapper[4797]: I0930 18:01:38.978722 4797 generic.go:334] "Generic (PLEG): container finished" podID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerID="e70e34b1e2f2abdb7afeaa78eb3df112af3e91dc117ecf5a8078b286412a83b4" exitCode=0 Sep 30 18:01:38 crc kubenswrapper[4797]: I0930 18:01:38.978749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4hzw4" event={"ID":"4d5a74b4-24b3-4369-bda9-4de7e98d9821","Type":"ContainerDied","Data":"e70e34b1e2f2abdb7afeaa78eb3df112af3e91dc117ecf5a8078b286412a83b4"} Sep 30 18:01:39 crc kubenswrapper[4797]: I0930 18:01:39.547422 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Sep 30 18:01:41 crc kubenswrapper[4797]: I0930 18:01:41.018005 4797 generic.go:334] "Generic (PLEG): container finished" podID="cb048627-f6eb-4487-8ba3-401d75c87743" 
containerID="ea62b865d31ec338be9c647e612fd8d84155210dcdd889557093246b4daca098" exitCode=0 Sep 30 18:01:41 crc kubenswrapper[4797]: I0930 18:01:41.018010 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4j78k" event={"ID":"cb048627-f6eb-4487-8ba3-401d75c87743","Type":"ContainerDied","Data":"ea62b865d31ec338be9c647e612fd8d84155210dcdd889557093246b4daca098"} Sep 30 18:01:43 crc kubenswrapper[4797]: E0930 18:01:43.382128 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 30 18:01:43 crc kubenswrapper[4797]: E0930 18:01:43.382975 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle
.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqrkp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-2q6s8_openstack(e7b2a372-883f-4418-9939-6de336219cb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:01:43 crc kubenswrapper[4797]: E0930 18:01:43.384265 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2q6s8" podUID="e7b2a372-883f-4418-9939-6de336219cb8" Sep 30 18:01:44 crc kubenswrapper[4797]: E0930 18:01:44.042788 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2q6s8" podUID="e7b2a372-883f-4418-9939-6de336219cb8" Sep 30 18:01:44 crc kubenswrapper[4797]: I0930 18:01:44.192099 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:01:44 crc kubenswrapper[4797]: I0930 18:01:44.192178 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:01:44 crc kubenswrapper[4797]: I0930 18:01:44.192244 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:01:44 crc kubenswrapper[4797]: I0930 18:01:44.193339 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:01:44 crc kubenswrapper[4797]: I0930 18:01:44.193473 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df" gracePeriod=600 Sep 30 18:01:45 crc kubenswrapper[4797]: I0930 18:01:45.054345 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df" exitCode=0 Sep 30 18:01:45 crc kubenswrapper[4797]: I0930 18:01:45.054390 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df"} Sep 30 18:01:45 crc kubenswrapper[4797]: I0930 18:01:45.054428 4797 scope.go:117] "RemoveContainer" containerID="24310c137eb65af07a384098fa62de96f749a8ebba9db197c7de2ab1bee41304" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.674487 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.683919 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.752813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data\") pod \"9b612415-f31c-47ac-b931-08ed46b9cfaf\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.752872 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mq8\" (UniqueName: \"kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8\") pod \"9b612415-f31c-47ac-b931-08ed46b9cfaf\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.753051 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca\") pod \"9b612415-f31c-47ac-b931-08ed46b9cfaf\" (UID: 
\"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.753145 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle\") pod \"9b612415-f31c-47ac-b931-08ed46b9cfaf\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.753176 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs\") pod \"9b612415-f31c-47ac-b931-08ed46b9cfaf\" (UID: \"9b612415-f31c-47ac-b931-08ed46b9cfaf\") " Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.754016 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs" (OuterVolumeSpecName: "logs") pod "9b612415-f31c-47ac-b931-08ed46b9cfaf" (UID: "9b612415-f31c-47ac-b931-08ed46b9cfaf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.763731 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8" (OuterVolumeSpecName: "kube-api-access-85mq8") pod "9b612415-f31c-47ac-b931-08ed46b9cfaf" (UID: "9b612415-f31c-47ac-b931-08ed46b9cfaf"). InnerVolumeSpecName "kube-api-access-85mq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.853663 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b612415-f31c-47ac-b931-08ed46b9cfaf" (UID: "9b612415-f31c-47ac-b931-08ed46b9cfaf"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.854753 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.854776 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b612415-f31c-47ac-b931-08ed46b9cfaf-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.854786 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mq8\" (UniqueName: \"kubernetes.io/projected/9b612415-f31c-47ac-b931-08ed46b9cfaf-kube-api-access-85mq8\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.860827 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9b612415-f31c-47ac-b931-08ed46b9cfaf" (UID: "9b612415-f31c-47ac-b931-08ed46b9cfaf"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.956825 4797 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:46 crc kubenswrapper[4797]: I0930 18:01:46.970922 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data" (OuterVolumeSpecName: "config-data") pod "9b612415-f31c-47ac-b931-08ed46b9cfaf" (UID: "9b612415-f31c-47ac-b931-08ed46b9cfaf"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.058714 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b612415-f31c-47ac-b931-08ed46b9cfaf-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.076694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b612415-f31c-47ac-b931-08ed46b9cfaf","Type":"ContainerDied","Data":"d5d2b06081f5f012737d9f106b3aee5e761c4aaf5b5b6c50ad211a410c6c9fe4"} Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.076777 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.109542 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.125791 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.142884 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:47 crc kubenswrapper[4797]: E0930 18:01:47.143360 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api-log" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.143377 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api-log" Sep 30 18:01:47 crc kubenswrapper[4797]: E0930 18:01:47.143397 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.143405 4797 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.143668 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.143690 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api-log" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.146030 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.155079 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.155718 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.162225 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.162301 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.162368 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle\") pod 
\"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.162411 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvf8\" (UniqueName: \"kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.162456 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.264479 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.264552 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvf8\" (UniqueName: \"kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.264581 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 
18:01:47.264639 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.264735 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.265187 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.274512 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.275033 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.275224 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data\") pod \"watcher-api-0\" (UID: 
\"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.283807 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvf8\" (UniqueName: \"kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8\") pod \"watcher-api-0\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " pod="openstack/watcher-api-0" Sep 30 18:01:47 crc kubenswrapper[4797]: I0930 18:01:47.469767 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:01:48 crc kubenswrapper[4797]: I0930 18:01:48.267033 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" path="/var/lib/kubelet/pods/9b612415-f31c-47ac-b931-08ed46b9cfaf/volumes" Sep 30 18:01:49 crc kubenswrapper[4797]: I0930 18:01:49.092739 4797 generic.go:334] "Generic (PLEG): container finished" podID="fe16203f-60b5-483f-83b5-1d26b25292c9" containerID="ee4184338f4202c69830b3b41d086b57863af8f284f4e6c7d49eac529777f16a" exitCode=0 Sep 30 18:01:49 crc kubenswrapper[4797]: I0930 18:01:49.093086 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f899g" event={"ID":"fe16203f-60b5-483f-83b5-1d26b25292c9","Type":"ContainerDied","Data":"ee4184338f4202c69830b3b41d086b57863af8f284f4e6c7d49eac529777f16a"} Sep 30 18:01:49 crc kubenswrapper[4797]: I0930 18:01:49.547629 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Sep 30 18:01:51 crc kubenswrapper[4797]: I0930 18:01:51.675309 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b612415-f31c-47ac-b931-08ed46b9cfaf" containerName="watcher-api" probeResult="failure" output="Get 
\"http://10.217.0.151:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 18:01:53.706516 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 18:01:53.707386 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67h697h65h589h57ch56dh89h5d8h55h7h674h5d6h547hdch5dfh5fbh654h89h57fh66fh5bbh6bh644h586h56ch69h88hc5h9dh85h554h66fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rp5sm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptio
ns:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-644644d7d5-bvd5r_openstack(14f99834-e13f-4ddd-a11c-41b7b3907b81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 18:01:53.721039 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-644644d7d5-bvd5r" podUID="14f99834-e13f-4ddd-a11c-41b7b3907b81" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 18:01:53.731511 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 18:01:53.731837 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h568h599h654h66bh89h56dh76hf7h64fh5d9h68bh5c6h67hb8h5cbhf4h8ch555h56h5c5h654h576h55fh9fhbch67fh554hc5h8h5bdhf5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-62tcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-549f8c559c-fcmkl_openstack(5e952a92-5166-407d-814d-10160a0eb033): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:01:53 crc kubenswrapper[4797]: E0930 
18:01:53.738516 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-549f8c559c-fcmkl" podUID="5e952a92-5166-407d-814d-10160a0eb033" Sep 30 18:01:54 crc kubenswrapper[4797]: I0930 18:01:54.548634 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Sep 30 18:01:54 crc kubenswrapper[4797]: I0930 18:01:54.549127 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:01:59 crc kubenswrapper[4797]: I0930 18:01:59.549085 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Sep 30 18:02:00 crc kubenswrapper[4797]: I0930 18:02:00.201936 4797 generic.go:334] "Generic (PLEG): container finished" podID="2deac399-0305-404a-bf66-cc9d4e122a3a" containerID="3af9863f921ac19ad985e6355e638a2adad2f8f630db7b65158976422722ce09" exitCode=0 Sep 30 18:02:00 crc kubenswrapper[4797]: I0930 18:02:00.202039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlvs2" event={"ID":"2deac399-0305-404a-bf66-cc9d4e122a3a","Type":"ContainerDied","Data":"3af9863f921ac19ad985e6355e638a2adad2f8f630db7b65158976422722ce09"} Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.303620 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.305280 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n598hf7h5c6hdfh679h669hd4hf6h57ch65ch5dh65dh54chf7h585h5fch5ddh54fh549hdfh68fh648h697h5d8h5ffhb9h8chb7h688h659h67ch675q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7w9qr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Termi
nationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-758c546b97-2hd7n_openstack(4a08b307-3f62-4868-b20c-f43e337c5481): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.310700 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-758c546b97-2hd7n" podUID="4a08b307-3f62-4868-b20c-f43e337c5481" Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.868851 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.869082 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbpnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hvw66_openstack(81ba253b-ce64-4926-a8d3-1c8dd9dfef16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:02:01 crc kubenswrapper[4797]: E0930 18:02:01.871823 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hvw66" 
podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" Sep 30 18:02:01 crc kubenswrapper[4797]: I0930 18:02:01.979730 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:02:01 crc kubenswrapper[4797]: I0930 18:02:01.992675 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159730 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmtp9\" (UniqueName: \"kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9\") pod \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159801 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc\") pod \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159875 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159918 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb\") pod \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159967 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.159993 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb\") pod \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.160043 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.160112 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.160161 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.160199 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config\") pod \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\" (UID: \"4d5a74b4-24b3-4369-bda9-4de7e98d9821\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.160265 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78n25\" (UniqueName: \"kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25\") pod \"cb048627-f6eb-4487-8ba3-401d75c87743\" (UID: \"cb048627-f6eb-4487-8ba3-401d75c87743\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.179537 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.179752 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.179853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9" (OuterVolumeSpecName: "kube-api-access-nmtp9") pod "4d5a74b4-24b3-4369-bda9-4de7e98d9821" (UID: "4d5a74b4-24b3-4369-bda9-4de7e98d9821"). InnerVolumeSpecName "kube-api-access-nmtp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.180768 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts" (OuterVolumeSpecName: "scripts") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.181791 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25" (OuterVolumeSpecName: "kube-api-access-78n25") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "kube-api-access-78n25". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.195117 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data" (OuterVolumeSpecName: "config-data") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.195779 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb048627-f6eb-4487-8ba3-401d75c87743" (UID: "cb048627-f6eb-4487-8ba3-401d75c87743"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.214253 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config" (OuterVolumeSpecName: "config") pod "4d5a74b4-24b3-4369-bda9-4de7e98d9821" (UID: "4d5a74b4-24b3-4369-bda9-4de7e98d9821"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.228285 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d5a74b4-24b3-4369-bda9-4de7e98d9821" (UID: "4d5a74b4-24b3-4369-bda9-4de7e98d9821"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.237511 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d5a74b4-24b3-4369-bda9-4de7e98d9821" (UID: "4d5a74b4-24b3-4369-bda9-4de7e98d9821"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.238463 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4j78k" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.239303 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d5a74b4-24b3-4369-bda9-4de7e98d9821" (UID: "4d5a74b4-24b3-4369-bda9-4de7e98d9821"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.241693 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4hzw4" Sep 30 18:02:02 crc kubenswrapper[4797]: E0930 18:02:02.243658 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hvw66" podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.269954 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.269981 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.269991 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78n25\" (UniqueName: \"kubernetes.io/projected/cb048627-f6eb-4487-8ba3-401d75c87743-kube-api-access-78n25\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270000 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmtp9\" (UniqueName: \"kubernetes.io/projected/4d5a74b4-24b3-4369-bda9-4de7e98d9821-kube-api-access-nmtp9\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270007 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270015 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270023 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270031 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270039 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a74b4-24b3-4369-bda9-4de7e98d9821-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270046 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.270053 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb048627-f6eb-4487-8ba3-401d75c87743-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.289577 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4j78k" event={"ID":"cb048627-f6eb-4487-8ba3-401d75c87743","Type":"ContainerDied","Data":"d99ee6bd3c2a7d06201a647654ca94e2fe2fbb06ca16e5ccd80270950b81a0ed"} Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.289622 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d99ee6bd3c2a7d06201a647654ca94e2fe2fbb06ca16e5ccd80270950b81a0ed" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 
18:02:02.289631 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4hzw4" event={"ID":"4d5a74b4-24b3-4369-bda9-4de7e98d9821","Type":"ContainerDied","Data":"6e1d642a8fca3e1d5b1b2d731beb4b4795762d544ff3a635b8273f0f32a1e2bb"} Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.344118 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.350819 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4hzw4"] Sep 30 18:02:02 crc kubenswrapper[4797]: E0930 18:02:02.495475 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Sep 30 18:02:02 crc kubenswrapper[4797]: E0930 18:02:02.495676 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf8h86h68dh5c7hdfh556h5f4h6fhd7h675h5bfh65ch5d8h668h5dh9dh5d8hb8hcch57ch64fh67dh666h5c5h577h657h5d8h56h88h94hd5h575q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4lsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4b69b34b-4d04-4a75-86cc-62cc21727907): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.535782 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-f899g" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.548118 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.563008 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.678453 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpnb\" (UniqueName: \"kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb\") pod \"fe16203f-60b5-483f-83b5-1d26b25292c9\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.678938 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data\") pod \"5e952a92-5166-407d-814d-10160a0eb033\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.679588 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data" (OuterVolumeSpecName: "config-data") pod "5e952a92-5166-407d-814d-10160a0eb033" (UID: "5e952a92-5166-407d-814d-10160a0eb033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.679679 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts\") pod \"5e952a92-5166-407d-814d-10160a0eb033\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680266 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts" (OuterVolumeSpecName: "scripts") pod "5e952a92-5166-407d-814d-10160a0eb033" (UID: "5e952a92-5166-407d-814d-10160a0eb033"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680329 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data\") pod \"fe16203f-60b5-483f-83b5-1d26b25292c9\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680635 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key\") pod \"5e952a92-5166-407d-814d-10160a0eb033\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680683 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle\") pod \"fe16203f-60b5-483f-83b5-1d26b25292c9\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680707 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts\") pod \"14f99834-e13f-4ddd-a11c-41b7b3907b81\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680757 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp5sm\" (UniqueName: \"kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm\") pod \"14f99834-e13f-4ddd-a11c-41b7b3907b81\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680808 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62tcc\" 
(UniqueName: \"kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc\") pod \"5e952a92-5166-407d-814d-10160a0eb033\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data\") pod \"14f99834-e13f-4ddd-a11c-41b7b3907b81\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.680920 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key\") pod \"14f99834-e13f-4ddd-a11c-41b7b3907b81\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.681502 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs\") pod \"5e952a92-5166-407d-814d-10160a0eb033\" (UID: \"5e952a92-5166-407d-814d-10160a0eb033\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.681426 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts" (OuterVolumeSpecName: "scripts") pod "14f99834-e13f-4ddd-a11c-41b7b3907b81" (UID: "14f99834-e13f-4ddd-a11c-41b7b3907b81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.681547 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data\") pod \"fe16203f-60b5-483f-83b5-1d26b25292c9\" (UID: \"fe16203f-60b5-483f-83b5-1d26b25292c9\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.681505 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data" (OuterVolumeSpecName: "config-data") pod "14f99834-e13f-4ddd-a11c-41b7b3907b81" (UID: "14f99834-e13f-4ddd-a11c-41b7b3907b81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.682225 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs" (OuterVolumeSpecName: "logs") pod "5e952a92-5166-407d-814d-10160a0eb033" (UID: "5e952a92-5166-407d-814d-10160a0eb033"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.681634 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs\") pod \"14f99834-e13f-4ddd-a11c-41b7b3907b81\" (UID: \"14f99834-e13f-4ddd-a11c-41b7b3907b81\") " Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.682636 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs" (OuterVolumeSpecName: "logs") pod "14f99834-e13f-4ddd-a11c-41b7b3907b81" (UID: "14f99834-e13f-4ddd-a11c-41b7b3907b81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.682809 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb" (OuterVolumeSpecName: "kube-api-access-8vpnb") pod "fe16203f-60b5-483f-83b5-1d26b25292c9" (UID: "fe16203f-60b5-483f-83b5-1d26b25292c9"). InnerVolumeSpecName "kube-api-access-8vpnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683343 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14f99834-e13f-4ddd-a11c-41b7b3907b81-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683366 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpnb\" (UniqueName: \"kubernetes.io/projected/fe16203f-60b5-483f-83b5-1d26b25292c9-kube-api-access-8vpnb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683379 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683389 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e952a92-5166-407d-814d-10160a0eb033-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683397 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683405 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/14f99834-e13f-4ddd-a11c-41b7b3907b81-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.683414 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e952a92-5166-407d-814d-10160a0eb033-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.684367 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "14f99834-e13f-4ddd-a11c-41b7b3907b81" (UID: "14f99834-e13f-4ddd-a11c-41b7b3907b81"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.685259 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm" (OuterVolumeSpecName: "kube-api-access-rp5sm") pod "14f99834-e13f-4ddd-a11c-41b7b3907b81" (UID: "14f99834-e13f-4ddd-a11c-41b7b3907b81"). InnerVolumeSpecName "kube-api-access-rp5sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.685565 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc" (OuterVolumeSpecName: "kube-api-access-62tcc") pod "5e952a92-5166-407d-814d-10160a0eb033" (UID: "5e952a92-5166-407d-814d-10160a0eb033"). InnerVolumeSpecName "kube-api-access-62tcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.685765 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5e952a92-5166-407d-814d-10160a0eb033" (UID: "5e952a92-5166-407d-814d-10160a0eb033"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.687013 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fe16203f-60b5-483f-83b5-1d26b25292c9" (UID: "fe16203f-60b5-483f-83b5-1d26b25292c9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.708993 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe16203f-60b5-483f-83b5-1d26b25292c9" (UID: "fe16203f-60b5-483f-83b5-1d26b25292c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.728157 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data" (OuterVolumeSpecName: "config-data") pod "fe16203f-60b5-483f-83b5-1d26b25292c9" (UID: "fe16203f-60b5-483f-83b5-1d26b25292c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784800 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784834 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e952a92-5166-407d-814d-10160a0eb033-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784844 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784853 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp5sm\" (UniqueName: \"kubernetes.io/projected/14f99834-e13f-4ddd-a11c-41b7b3907b81-kube-api-access-rp5sm\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784862 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62tcc\" (UniqueName: \"kubernetes.io/projected/5e952a92-5166-407d-814d-10160a0eb033-kube-api-access-62tcc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784870 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14f99834-e13f-4ddd-a11c-41b7b3907b81-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:02 crc kubenswrapper[4797]: I0930 18:02:02.784880 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe16203f-60b5-483f-83b5-1d26b25292c9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:03 crc 
kubenswrapper[4797]: I0930 18:02:03.231971 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4j78k"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.241901 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4j78k"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.254041 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-549f8c559c-fcmkl" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.254036 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-549f8c559c-fcmkl" event={"ID":"5e952a92-5166-407d-814d-10160a0eb033","Type":"ContainerDied","Data":"81cf83307edf8bcbf82c5dca853613e608c668cba15319a5436c543d8c0b0d6c"} Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.255751 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644644d7d5-bvd5r" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.255769 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644644d7d5-bvd5r" event={"ID":"14f99834-e13f-4ddd-a11c-41b7b3907b81","Type":"ContainerDied","Data":"b44bcb13feb91b76dd1cbc429c867ff0429d18ce6a00f081de4f5afdcfbd1104"} Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.260920 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f899g" event={"ID":"fe16203f-60b5-483f-83b5-1d26b25292c9","Type":"ContainerDied","Data":"9eab8b9b6e738585766f4b6640833667f65fa49131767f074f6b3f413eab1c66"} Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.260961 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eab8b9b6e738585766f4b6640833667f65fa49131767f074f6b3f413eab1c66" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.261061 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f899g" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.342502 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.357747 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-549f8c559c-fcmkl"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.364242 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7w9jn"] Sep 30 18:02:03 crc kubenswrapper[4797]: E0930 18:02:03.365149 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" containerName="glance-db-sync" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365186 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" containerName="glance-db-sync" Sep 30 18:02:03 crc kubenswrapper[4797]: E0930 18:02:03.365223 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="init" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365232 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="init" Sep 30 18:02:03 crc kubenswrapper[4797]: E0930 18:02:03.365246 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365252 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" Sep 30 18:02:03 crc kubenswrapper[4797]: E0930 18:02:03.365262 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb048627-f6eb-4487-8ba3-401d75c87743" containerName="keystone-bootstrap" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365269 4797 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="cb048627-f6eb-4487-8ba3-401d75c87743" containerName="keystone-bootstrap" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365549 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb048627-f6eb-4487-8ba3-401d75c87743" containerName="keystone-bootstrap" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365578 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.365599 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" containerName="glance-db-sync" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.367994 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.375213 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.375138 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8tsjk" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.375371 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.378863 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.383594 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7w9jn"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.393454 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.404709 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-644644d7d5-bvd5r"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.517166 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.517856 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.517978 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7l7r\" (UniqueName: \"kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.518056 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.518551 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: 
\"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.518647 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.620767 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7l7r\" (UniqueName: \"kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.620837 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.620945 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.621071 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 
18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.621135 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.621163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.629806 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.629925 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.629999 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.630093 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.643877 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.644104 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7l7r\" (UniqueName: \"kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r\") pod \"keystone-bootstrap-7w9jn\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.690395 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.862858 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.868067 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.902343 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927718 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927801 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927843 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927888 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927912 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:03 crc kubenswrapper[4797]: I0930 18:02:03.927938 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9tcj\" (UniqueName: \"kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029187 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029221 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029244 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9tcj\" (UniqueName: 
\"kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029311 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.029351 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.030376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.030601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.031124 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.031192 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.031612 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.045247 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9tcj\" (UniqueName: \"kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj\") pod \"dnsmasq-dns-8b5c85b87-pztk9\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.203801 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.249981 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f99834-e13f-4ddd-a11c-41b7b3907b81" path="/var/lib/kubelet/pods/14f99834-e13f-4ddd-a11c-41b7b3907b81/volumes" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.250425 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" path="/var/lib/kubelet/pods/4d5a74b4-24b3-4369-bda9-4de7e98d9821/volumes" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.251282 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e952a92-5166-407d-814d-10160a0eb033" path="/var/lib/kubelet/pods/5e952a92-5166-407d-814d-10160a0eb033/volumes" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.251697 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb048627-f6eb-4487-8ba3-401d75c87743" path="/var/lib/kubelet/pods/cb048627-f6eb-4487-8ba3-401d75c87743/volumes" Sep 30 18:02:04 crc kubenswrapper[4797]: E0930 18:02:04.477685 4797 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 30 18:02:04 crc kubenswrapper[4797]: E0930 18:02:04.477842 4797 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f6sbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2zrb5_openstack(7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 18:02:04 crc kubenswrapper[4797]: E0930 18:02:04.479072 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2zrb5" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.519478 4797 scope.go:117] "RemoveContainer" containerID="1a4df30479adbfbac9241512851b9d7d3e8e0b517821f5d148d329a156313f8f" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.550472 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4hzw4" podUID="4d5a74b4-24b3-4369-bda9-4de7e98d9821" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.609009 4797 scope.go:117] "RemoveContainer" containerID="2c1b281620c6f953d650ed9a9d2e4f82b0bd604d75e8f72ebde3cac8949042c6" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.727215 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.727654 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.769315 4797 scope.go:117] "RemoveContainer" containerID="e70e34b1e2f2abdb7afeaa78eb3df112af3e91dc117ecf5a8078b286412a83b4" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.788916 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key\") pod \"4a08b307-3f62-4868-b20c-f43e337c5481\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789061 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs\") pod \"4a08b307-3f62-4868-b20c-f43e337c5481\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789111 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sm2p\" (UniqueName: \"kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p\") pod \"2deac399-0305-404a-bf66-cc9d4e122a3a\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789149 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle\") pod \"2deac399-0305-404a-bf66-cc9d4e122a3a\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789315 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts\") pod \"4a08b307-3f62-4868-b20c-f43e337c5481\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") 
" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789648 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9qr\" (UniqueName: \"kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr\") pod \"4a08b307-3f62-4868-b20c-f43e337c5481\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789733 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config\") pod \"2deac399-0305-404a-bf66-cc9d4e122a3a\" (UID: \"2deac399-0305-404a-bf66-cc9d4e122a3a\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.789758 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data\") pod \"4a08b307-3f62-4868-b20c-f43e337c5481\" (UID: \"4a08b307-3f62-4868-b20c-f43e337c5481\") " Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.791330 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data" (OuterVolumeSpecName: "config-data") pod "4a08b307-3f62-4868-b20c-f43e337c5481" (UID: "4a08b307-3f62-4868-b20c-f43e337c5481"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.795299 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts" (OuterVolumeSpecName: "scripts") pod "4a08b307-3f62-4868-b20c-f43e337c5481" (UID: "4a08b307-3f62-4868-b20c-f43e337c5481"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.800690 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs" (OuterVolumeSpecName: "logs") pod "4a08b307-3f62-4868-b20c-f43e337c5481" (UID: "4a08b307-3f62-4868-b20c-f43e337c5481"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.800711 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p" (OuterVolumeSpecName: "kube-api-access-2sm2p") pod "2deac399-0305-404a-bf66-cc9d4e122a3a" (UID: "2deac399-0305-404a-bf66-cc9d4e122a3a"). InnerVolumeSpecName "kube-api-access-2sm2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.814381 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4a08b307-3f62-4868-b20c-f43e337c5481" (UID: "4a08b307-3f62-4868-b20c-f43e337c5481"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.834715 4797 scope.go:117] "RemoveContainer" containerID="43f46b1243691b0907b8cac380cab5f530b7bbfce39aa59afb43e44cfd1d3db1" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.835226 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr" (OuterVolumeSpecName: "kube-api-access-7w9qr") pod "4a08b307-3f62-4868-b20c-f43e337c5481" (UID: "4a08b307-3f62-4868-b20c-f43e337c5481"). InnerVolumeSpecName "kube-api-access-7w9qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891638 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08b307-3f62-4868-b20c-f43e337c5481-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891662 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sm2p\" (UniqueName: \"kubernetes.io/projected/2deac399-0305-404a-bf66-cc9d4e122a3a-kube-api-access-2sm2p\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891673 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891683 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9qr\" (UniqueName: \"kubernetes.io/projected/4a08b307-3f62-4868-b20c-f43e337c5481-kube-api-access-7w9qr\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891692 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a08b307-3f62-4868-b20c-f43e337c5481-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.891700 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a08b307-3f62-4868-b20c-f43e337c5481-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.912595 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:04 crc kubenswrapper[4797]: E0930 18:02:04.912951 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2deac399-0305-404a-bf66-cc9d4e122a3a" 
containerName="neutron-db-sync" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.912966 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2deac399-0305-404a-bf66-cc9d4e122a3a" containerName="neutron-db-sync" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.913161 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2deac399-0305-404a-bf66-cc9d4e122a3a" containerName="neutron-db-sync" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.916128 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.927185 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.927654 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n2zcg" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.927929 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.992115 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.998372 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.998733 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cxj\" (UniqueName: \"kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj\") pod 
\"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.999016 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.999142 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.999295 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.999519 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:04 crc kubenswrapper[4797]: I0930 18:02:04.999657 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs\") pod 
\"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.041782 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2deac399-0305-404a-bf66-cc9d4e122a3a" (UID: "2deac399-0305-404a-bf66-cc9d4e122a3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.059010 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config" (OuterVolumeSpecName: "config") pod "2deac399-0305-404a-bf66-cc9d4e122a3a" (UID: "2deac399-0305-404a-bf66-cc9d4e122a3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.073397 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.078015 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.081354 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.101322 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24cxj\" (UniqueName: \"kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.102300 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.102848 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.102786 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.103152 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.103758 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.102963 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.131158 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.130946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.131840 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.144379 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " 
pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.144827 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.145759 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2deac399-0305-404a-bf66-cc9d4e122a3a-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.132788 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.148848 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.154706 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6676d4ddcd-sxf6l"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.178539 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.201928 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.220803 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249552 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249724 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249834 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.249969 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.250090 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nl5q\" (UniqueName: \"kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.250148 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cxj\" (UniqueName: \"kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj\") pod \"glance-default-external-api-0\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.308797 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6676d4ddcd-sxf6l" event={"ID":"04e30fb7-7876-4a90-b887-05b7da2f7746","Type":"ContainerStarted","Data":"7e9be64d2e0d1943ac8a37e1ae42197f277c41bd3e8529d74a5f875c729d6833"} Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.309721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerStarted","Data":"c94794a1acd4f98b3642b69d428b6178b9427c0d83ee1e4aab018c545082bf62"} Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.312577 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758c546b97-2hd7n" event={"ID":"4a08b307-3f62-4868-b20c-f43e337c5481","Type":"ContainerDied","Data":"787d4fc75ba033f5ec1eb11455603c645f4cad563e496609839bac77254da23d"} Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.312666 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758c546b97-2hd7n" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.323742 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vlvs2" event={"ID":"2deac399-0305-404a-bf66-cc9d4e122a3a","Type":"ContainerDied","Data":"2b1502cc709162c05e33c02d20ac8403820616d4fd26088af13d832a343b58c9"} Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.323777 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1502cc709162c05e33c02d20ac8403820616d4fd26088af13d832a343b58c9" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.323856 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vlvs2" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.335579 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae"} Sep 30 18:02:05 crc kubenswrapper[4797]: E0930 18:02:05.337248 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2zrb5" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359181 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359232 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359268 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 
18:02:05.359303 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nl5q\" (UniqueName: \"kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359374 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359409 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.359447 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.360732 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.363644 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.365154 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.372519 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.380121 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.381855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nl5q\" (UniqueName: \"kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.388495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.450344 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.468494 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.534280 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.543837 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.554469 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7w9jn"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.582940 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.593329 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-758c546b97-2hd7n"] Sep 30 18:02:05 crc kubenswrapper[4797]: I0930 18:02:05.615499 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.023797 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.082989 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.088291 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.131501 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.185235 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.187556 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189604 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189665 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkbv4\" (UniqueName: \"kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189756 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189827 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.189850 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.191269 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.191414 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.191593 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vrxmm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.191737 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.208965 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.278585 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a08b307-3f62-4868-b20c-f43e337c5481" path="/var/lib/kubelet/pods/4a08b307-3f62-4868-b20c-f43e337c5481/volumes" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295489 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qt25\" (UniqueName: \"kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " 
pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295607 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295658 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkbv4\" (UniqueName: \"kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295735 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295775 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295798 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295822 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.295877 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 
18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.296847 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.297589 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.298234 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.298581 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.299521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.344650 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hkbv4\" (UniqueName: \"kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4\") pod \"dnsmasq-dns-84b966f6c9-mtmnm\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.362110 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2q6s8" event={"ID":"e7b2a372-883f-4418-9939-6de336219cb8","Type":"ContainerStarted","Data":"5cbf1fa59249f612abc40fd5a72b59c307cb46eaed7b67111ff0c7429004bf0c"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.385713 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerStarted","Data":"ab4cec5bc2c2a3091b7a39a4c36254544fc5c893fb1334ac5ac721112fc4464c"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.385778 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerStarted","Data":"f04df4542a24144659a5be64fb5300304deb7d9ac5c01fe97cca9950ae96ce5c"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.388932 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2q6s8" podStartSLOduration=4.161654708 podStartE2EDuration="40.388910746s" podCreationTimestamp="2025-09-30 18:01:26 +0000 UTC" firstStartedPulling="2025-09-30 18:01:28.316101993 +0000 UTC m=+1138.838601231" lastFinishedPulling="2025-09-30 18:02:04.543358031 +0000 UTC m=+1175.065857269" observedRunningTime="2025-09-30 18:02:06.379844759 +0000 UTC m=+1176.902343997" watchObservedRunningTime="2025-09-30 18:02:06.388910746 +0000 UTC m=+1176.911409984" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.397422 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qt25\" (UniqueName: 
\"kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.397555 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.397590 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.397688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.397724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.402912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" 
event={"ID":"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4","Type":"ContainerStarted","Data":"fa3ea813be65501efdb706a7df5d2891fca2fe34afc8e95fea9bb99be4238daf"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.406972 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.407207 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.410544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af798459-89f2-474d-9082-eee9e1712e86","Type":"ContainerStarted","Data":"554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.419163 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.424150 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.436890 
4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7w9jn" event={"ID":"6162f46a-e25e-4bcf-8e84-77d28c565c1c","Type":"ContainerStarted","Data":"dbf57f337765aaf465722aff3d0898be3e5eaf84178aabf29284c91a5a4e5121"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.437844 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qt25\" (UniqueName: \"kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25\") pod \"neutron-74f97446fb-hwg4l\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.445102 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a","Type":"ContainerStarted","Data":"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310"} Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.451395 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=7.206313786 podStartE2EDuration="41.45136631s" podCreationTimestamp="2025-09-30 18:01:25 +0000 UTC" firstStartedPulling="2025-09-30 18:01:27.626758307 +0000 UTC m=+1138.149257545" lastFinishedPulling="2025-09-30 18:02:01.871810831 +0000 UTC m=+1172.394310069" observedRunningTime="2025-09-30 18:02:06.447808963 +0000 UTC m=+1176.970308201" watchObservedRunningTime="2025-09-30 18:02:06.45136631 +0000 UTC m=+1176.973865548" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.475950 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.487559 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=7.529022475 podStartE2EDuration="41.487540808s" podCreationTimestamp="2025-09-30 18:01:25 +0000 UTC" firstStartedPulling="2025-09-30 18:01:27.911261783 +0000 UTC m=+1138.433761021" lastFinishedPulling="2025-09-30 18:02:01.869780096 +0000 UTC m=+1172.392279354" observedRunningTime="2025-09-30 18:02:06.481145803 +0000 UTC m=+1177.003645041" watchObservedRunningTime="2025-09-30 18:02:06.487540808 +0000 UTC m=+1177.010040036" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.532180 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.548727 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.584074 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.603922 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.603977 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 18:02:06 crc kubenswrapper[4797]: I0930 18:02:06.725458 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.125209 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.185886 4797 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.305635 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.458018 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7w9jn" event={"ID":"6162f46a-e25e-4bcf-8e84-77d28c565c1c","Type":"ContainerStarted","Data":"0a3d51a5d0e7481e5aa6b162ab3b98975b6c21a692ec800cc4369e1149c4b43f"} Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.460845 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerStarted","Data":"25dc105cb75534d271a244678b5ef647b258f1830f9e257c231aa85b1a9cd98b"} Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.465466 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" event={"ID":"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4","Type":"ContainerStarted","Data":"1927a1a1401275d5de3568099d7284988ecc851266c150b008ae833888948353"} Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.465806 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" podUID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" containerName="init" containerID="cri-o://1927a1a1401275d5de3568099d7284988ecc851266c150b008ae833888948353" gracePeriod=10 Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.468421 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" event={"ID":"a263de13-afff-4c1b-8b43-8ecfac6c9855","Type":"ContainerStarted","Data":"77569fc9a44ec6de606022d1154d685ddad58210d74c8d3a752fab8bfb7b95bf"} Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.470248 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" 
Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.494990 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7w9jn" podStartSLOduration=4.494964996 podStartE2EDuration="4.494964996s" podCreationTimestamp="2025-09-30 18:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:07.479473712 +0000 UTC m=+1178.001972950" watchObservedRunningTime="2025-09-30 18:02:07.494964996 +0000 UTC m=+1178.017464234" Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.535897 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.576365 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.627012 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:07 crc kubenswrapper[4797]: I0930 18:02:07.646496 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:08 crc kubenswrapper[4797]: I0930 18:02:08.493814 4797 generic.go:334] "Generic (PLEG): container finished" podID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" containerID="1927a1a1401275d5de3568099d7284988ecc851266c150b008ae833888948353" exitCode=0 Sep 30 18:02:08 crc kubenswrapper[4797]: I0930 18:02:08.494279 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" event={"ID":"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4","Type":"ContainerDied","Data":"1927a1a1401275d5de3568099d7284988ecc851266c150b008ae833888948353"} Sep 30 18:02:08 crc kubenswrapper[4797]: I0930 18:02:08.496983 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 18:02:08 crc kubenswrapper[4797]: 
I0930 18:02:08.522802 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=21.52277014 podStartE2EDuration="21.52277014s" podCreationTimestamp="2025-09-30 18:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:08.520851977 +0000 UTC m=+1179.043351225" watchObservedRunningTime="2025-09-30 18:02:08.52277014 +0000 UTC m=+1179.045269388" Sep 30 18:02:08 crc kubenswrapper[4797]: I0930 18:02:08.686909 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:09 crc kubenswrapper[4797]: I0930 18:02:09.407656 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:09 crc kubenswrapper[4797]: I0930 18:02:09.504607 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" containerID="cri-o://4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" gracePeriod=30 Sep 30 18:02:09 crc kubenswrapper[4797]: I0930 18:02:09.506032 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="af798459-89f2-474d-9082-eee9e1712e86" containerName="watcher-decision-engine" containerID="cri-o://554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" gracePeriod=30 Sep 30 18:02:09 crc kubenswrapper[4797]: I0930 18:02:09.621498 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:09 crc kubenswrapper[4797]: W0930 18:02:09.743740 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12faf91_f7bd_4a27_9829_34a931141b7e.slice/crio-b32c8738108811e1794a01a610b52d0330a1e22cba8536944316cfa51914c153 WatchSource:0}: Error finding container b32c8738108811e1794a01a610b52d0330a1e22cba8536944316cfa51914c153: Status 404 returned error can't find the container with id b32c8738108811e1794a01a610b52d0330a1e22cba8536944316cfa51914c153 Sep 30 18:02:09 crc kubenswrapper[4797]: W0930 18:02:09.747805 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2152a741_6045_4223_8d2a_9a1c24191d99.slice/crio-fd1e37a6a7ac74fd4dab6e106e2a67c1b2a77f90cefeda57adc5543d681ce73c WatchSource:0}: Error finding container fd1e37a6a7ac74fd4dab6e106e2a67c1b2a77f90cefeda57adc5543d681ce73c: Status 404 returned error can't find the container with id fd1e37a6a7ac74fd4dab6e106e2a67c1b2a77f90cefeda57adc5543d681ce73c Sep 30 18:02:09 crc kubenswrapper[4797]: W0930 18:02:09.751551 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592ad170_d90d_432f_a862_196f26e12d58.slice/crio-40c72747e3cc8d6063fb30d63b3528835819066938c56cbeb9654a466ab8cae0 WatchSource:0}: Error finding container 40c72747e3cc8d6063fb30d63b3528835819066938c56cbeb9654a466ab8cae0: Status 404 returned error can't find the container with id 40c72747e3cc8d6063fb30d63b3528835819066938c56cbeb9654a466ab8cae0 Sep 30 18:02:09 crc kubenswrapper[4797]: I0930 18:02:09.933830 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079054 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079110 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9tcj\" (UniqueName: \"kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079173 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079257 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079319 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.079333 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb\") pod \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\" (UID: \"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4\") " Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.123754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.125942 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj" (OuterVolumeSpecName: "kube-api-access-f9tcj") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "kube-api-access-f9tcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.128371 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.152538 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.156218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config" (OuterVolumeSpecName: "config") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.182105 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.182194 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.182210 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.182221 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9tcj\" (UniqueName: \"kubernetes.io/projected/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-kube-api-access-f9tcj\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.182232 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.193392 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" (UID: "590cf7bf-19f5-4d56-a2b7-ea9faf570fc4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.293812 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.349243 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fc4d4d55c-fzms2"] Sep 30 18:02:10 crc kubenswrapper[4797]: E0930 18:02:10.350067 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" containerName="init" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.350090 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" containerName="init" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.350539 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" containerName="init" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.352493 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.355409 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.357176 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.421759 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fc4d4d55c-fzms2"] Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511003 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-httpd-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511085 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w85c\" (UniqueName: \"kubernetes.io/projected/6726c7e0-d359-494e-9a9b-54c878d16e6b-kube-api-access-4w85c\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511216 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-ovndb-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511254 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-public-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511304 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-combined-ca-bundle\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.511377 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-internal-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.530257 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerStarted","Data":"b32c8738108811e1794a01a610b52d0330a1e22cba8536944316cfa51914c153"} Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.540258 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" 
event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerStarted","Data":"fd1e37a6a7ac74fd4dab6e106e2a67c1b2a77f90cefeda57adc5543d681ce73c"} Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.559311 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerStarted","Data":"40c72747e3cc8d6063fb30d63b3528835819066938c56cbeb9654a466ab8cae0"} Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.566114 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" event={"ID":"590cf7bf-19f5-4d56-a2b7-ea9faf570fc4","Type":"ContainerDied","Data":"fa3ea813be65501efdb706a7df5d2891fca2fe34afc8e95fea9bb99be4238daf"} Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.566184 4797 scope.go:117] "RemoveContainer" containerID="1927a1a1401275d5de3568099d7284988ecc851266c150b008ae833888948353" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.566227 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pztk9" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.616054 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w85c\" (UniqueName: \"kubernetes.io/projected/6726c7e0-d359-494e-9a9b-54c878d16e6b-kube-api-access-4w85c\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.616564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-ovndb-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.617121 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-public-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.617176 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-combined-ca-bundle\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.617222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc 
kubenswrapper[4797]: I0930 18:02:10.617250 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-internal-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.617295 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-httpd-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.622055 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-combined-ca-bundle\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.622473 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-ovndb-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.628694 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-public-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.628769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-internal-tls-certs\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.629259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-httpd-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.629567 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6726c7e0-d359-494e-9a9b-54c878d16e6b-config\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.654020 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w85c\" (UniqueName: \"kubernetes.io/projected/6726c7e0-d359-494e-9a9b-54c878d16e6b-kube-api-access-4w85c\") pod \"neutron-6fc4d4d55c-fzms2\" (UID: \"6726c7e0-d359-494e-9a9b-54c878d16e6b\") " pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.683251 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.747788 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:10 crc kubenswrapper[4797]: I0930 18:02:10.761100 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pztk9"] Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.242954 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.438276 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fc4d4d55c-fzms2"] Sep 30 18:02:11 crc kubenswrapper[4797]: W0930 18:02:11.449493 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6726c7e0_d359_494e_9a9b_54c878d16e6b.slice/crio-d6db0388bd83cdf54e47e87d1b6934d3d13a262f2f7cfb7511190f2f36f3c5b4 WatchSource:0}: Error finding container d6db0388bd83cdf54e47e87d1b6934d3d13a262f2f7cfb7511190f2f36f3c5b4: Status 404 returned error can't find the container with id d6db0388bd83cdf54e47e87d1b6934d3d13a262f2f7cfb7511190f2f36f3c5b4 Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.612045 4797 generic.go:334] "Generic (PLEG): container finished" podID="e7b2a372-883f-4418-9939-6de336219cb8" containerID="5cbf1fa59249f612abc40fd5a72b59c307cb46eaed7b67111ff0c7429004bf0c" exitCode=0 Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.612126 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2q6s8" event={"ID":"e7b2a372-883f-4418-9939-6de336219cb8","Type":"ContainerDied","Data":"5cbf1fa59249f612abc40fd5a72b59c307cb46eaed7b67111ff0c7429004bf0c"} Sep 30 18:02:11 crc kubenswrapper[4797]: E0930 18:02:11.614778 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:11 crc kubenswrapper[4797]: E0930 18:02:11.625307 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.626200 4797 generic.go:334] "Generic (PLEG): container finished" podID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerID="653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f" exitCode=0 Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.626330 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" event={"ID":"a263de13-afff-4c1b-8b43-8ecfac6c9855","Type":"ContainerDied","Data":"653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f"} Sep 30 18:02:11 crc kubenswrapper[4797]: E0930 18:02:11.634612 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:11 crc kubenswrapper[4797]: E0930 18:02:11.634754 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 
18:02:11.644706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6676d4ddcd-sxf6l" event={"ID":"04e30fb7-7876-4a90-b887-05b7da2f7746","Type":"ContainerStarted","Data":"36a6adba377bc9ef5681d5d89a6434eaf7eb2d4e03e624ee8cb987e2724b3e14"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.644750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6676d4ddcd-sxf6l" event={"ID":"04e30fb7-7876-4a90-b887-05b7da2f7746","Type":"ContainerStarted","Data":"75092a15524b958877a426a941590df41936c9407fcdc33bddc781170c5791b6"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.669795 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerStarted","Data":"4da4408872f36a9a255e9ad81bc028da42b6c152b6104356d541f19f61fde6e2"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.694887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerStarted","Data":"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.695198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerStarted","Data":"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.699613 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4d4d55c-fzms2" event={"ID":"6726c7e0-d359-494e-9a9b-54c878d16e6b","Type":"ContainerStarted","Data":"d6db0388bd83cdf54e47e87d1b6934d3d13a262f2f7cfb7511190f2f36f3c5b4"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.705772 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerStarted","Data":"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.710379 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6676d4ddcd-sxf6l" podStartSLOduration=32.147697147 podStartE2EDuration="36.710361516s" podCreationTimestamp="2025-09-30 18:01:35 +0000 UTC" firstStartedPulling="2025-09-30 18:02:05.193285131 +0000 UTC m=+1175.715784369" lastFinishedPulling="2025-09-30 18:02:09.7559495 +0000 UTC m=+1180.278448738" observedRunningTime="2025-09-30 18:02:11.694723639 +0000 UTC m=+1182.217222877" watchObservedRunningTime="2025-09-30 18:02:11.710361516 +0000 UTC m=+1182.232860754" Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.721816 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerStarted","Data":"3a5ccafd363aa07a7ba63d2c87e4375de2bae4a4820ef0aba0e41a77ec9e3189"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.730705 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerStarted","Data":"df75fae43cfd850b1992da0675a8654efd4971a9b346815a07670649b0aeb8ba"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.730750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerStarted","Data":"c080b2266602c152fc73c4139b8eed2322891325c1ced93ea64e972e3f4266e5"} Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.730847 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.737584 4797 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/horizon-ff7898f76-hfsxf" podStartSLOduration=32.091608947 podStartE2EDuration="36.737562958s" podCreationTimestamp="2025-09-30 18:01:35 +0000 UTC" firstStartedPulling="2025-09-30 18:02:05.193722934 +0000 UTC m=+1175.716222172" lastFinishedPulling="2025-09-30 18:02:09.839676945 +0000 UTC m=+1180.362176183" observedRunningTime="2025-09-30 18:02:11.718649162 +0000 UTC m=+1182.241148400" watchObservedRunningTime="2025-09-30 18:02:11.737562958 +0000 UTC m=+1182.260062196" Sep 30 18:02:11 crc kubenswrapper[4797]: I0930 18:02:11.770141 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74f97446fb-hwg4l" podStartSLOduration=5.770118527 podStartE2EDuration="5.770118527s" podCreationTimestamp="2025-09-30 18:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:11.755820236 +0000 UTC m=+1182.278319464" watchObservedRunningTime="2025-09-30 18:02:11.770118527 +0000 UTC m=+1182.292617755" Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.254681 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590cf7bf-19f5-4d56-a2b7-ea9faf570fc4" path="/var/lib/kubelet/pods/590cf7bf-19f5-4d56-a2b7-ea9faf570fc4/volumes" Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.470035 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.768917 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" event={"ID":"a263de13-afff-4c1b-8b43-8ecfac6c9855","Type":"ContainerStarted","Data":"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891"} Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.770986 4797 generic.go:334] "Generic (PLEG): container finished" podID="6162f46a-e25e-4bcf-8e84-77d28c565c1c" 
containerID="0a3d51a5d0e7481e5aa6b162ab3b98975b6c21a692ec800cc4369e1149c4b43f" exitCode=0 Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.771039 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7w9jn" event={"ID":"6162f46a-e25e-4bcf-8e84-77d28c565c1c","Type":"ContainerDied","Data":"0a3d51a5d0e7481e5aa6b162ab3b98975b6c21a692ec800cc4369e1149c4b43f"} Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.782170 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4d4d55c-fzms2" event={"ID":"6726c7e0-d359-494e-9a9b-54c878d16e6b","Type":"ContainerStarted","Data":"e6a34828e8b40af83db4ff7cde9adbbca064452689e4a2538f7203748db7d0cb"} Sep 30 18:02:12 crc kubenswrapper[4797]: I0930 18:02:12.782209 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4d4d55c-fzms2" event={"ID":"6726c7e0-d359-494e-9a9b-54c878d16e6b","Type":"ContainerStarted","Data":"691f3be87b3558e26ec30b50f4c92115a892e001b72db2a5ccbada012d311196"} Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.202341 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2q6s8" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.288351 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts\") pod \"e7b2a372-883f-4418-9939-6de336219cb8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.288524 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqrkp\" (UniqueName: \"kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp\") pod \"e7b2a372-883f-4418-9939-6de336219cb8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.288616 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle\") pod \"e7b2a372-883f-4418-9939-6de336219cb8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.288701 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data\") pod \"e7b2a372-883f-4418-9939-6de336219cb8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.288734 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs\") pod \"e7b2a372-883f-4418-9939-6de336219cb8\" (UID: \"e7b2a372-883f-4418-9939-6de336219cb8\") " Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.289601 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs" (OuterVolumeSpecName: "logs") pod "e7b2a372-883f-4418-9939-6de336219cb8" (UID: "e7b2a372-883f-4418-9939-6de336219cb8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.295878 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts" (OuterVolumeSpecName: "scripts") pod "e7b2a372-883f-4418-9939-6de336219cb8" (UID: "e7b2a372-883f-4418-9939-6de336219cb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.301559 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp" (OuterVolumeSpecName: "kube-api-access-fqrkp") pod "e7b2a372-883f-4418-9939-6de336219cb8" (UID: "e7b2a372-883f-4418-9939-6de336219cb8"). InnerVolumeSpecName "kube-api-access-fqrkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.328653 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b2a372-883f-4418-9939-6de336219cb8" (UID: "e7b2a372-883f-4418-9939-6de336219cb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.353861 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data" (OuterVolumeSpecName: "config-data") pod "e7b2a372-883f-4418-9939-6de336219cb8" (UID: "e7b2a372-883f-4418-9939-6de336219cb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.390892 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.390932 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.390943 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7b2a372-883f-4418-9939-6de336219cb8-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.390955 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b2a372-883f-4418-9939-6de336219cb8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.390966 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqrkp\" (UniqueName: \"kubernetes.io/projected/e7b2a372-883f-4418-9939-6de336219cb8-kube-api-access-fqrkp\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.804610 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerStarted","Data":"62f2f44fc288da505a801b8b169a87091f0df8b405570b1cb41808d2473ddc86"} Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.805793 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-log" 
containerID="cri-o://3a5ccafd363aa07a7ba63d2c87e4375de2bae4a4820ef0aba0e41a77ec9e3189" gracePeriod=30 Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.806279 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-httpd" containerID="cri-o://62f2f44fc288da505a801b8b169a87091f0df8b405570b1cb41808d2473ddc86" gracePeriod=30 Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.830046 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c47f4984-nxfz7"] Sep 30 18:02:13 crc kubenswrapper[4797]: E0930 18:02:13.830625 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b2a372-883f-4418-9939-6de336219cb8" containerName="placement-db-sync" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.830699 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b2a372-883f-4418-9939-6de336219cb8" containerName="placement-db-sync" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.830957 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b2a372-883f-4418-9939-6de336219cb8" containerName="placement-db-sync" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.831966 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.845285 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.845513 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.849051 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerStarted","Data":"3e30b41d6dd6f435cf3ad2a5bd3b365151f30b61db5683352c9e20c9bfc92d79"} Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.861363 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-log" containerID="cri-o://4da4408872f36a9a255e9ad81bc028da42b6c152b6104356d541f19f61fde6e2" gracePeriod=30 Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.861528 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-httpd" containerID="cri-o://3e30b41d6dd6f435cf3ad2a5bd3b365151f30b61db5683352c9e20c9bfc92d79" gracePeriod=30 Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.864354 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2q6s8" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.864650 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2q6s8" event={"ID":"e7b2a372-883f-4418-9939-6de336219cb8","Type":"ContainerDied","Data":"9be1ad00f28c67a4eeedb62ac0aa21f082d8ff400fdd6846a488a2795ca7913d"} Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.864724 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9be1ad00f28c67a4eeedb62ac0aa21f082d8ff400fdd6846a488a2795ca7913d" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.867763 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.889011 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c47f4984-nxfz7"] Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907351 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-scripts\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907423 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-combined-ca-bundle\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907465 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-config-data\") pod 
\"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907592 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-internal-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907674 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hfs\" (UniqueName: \"kubernetes.io/projected/cd02f0fa-36e8-4676-802c-37127e022ad0-kube-api-access-t5hfs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907717 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-public-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.907745 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd02f0fa-36e8-4676-802c-37127e022ad0-logs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.911641 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.888971141 podStartE2EDuration="10.888971141s" 
podCreationTimestamp="2025-09-30 18:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:13.845788222 +0000 UTC m=+1184.368287460" watchObservedRunningTime="2025-09-30 18:02:13.888971141 +0000 UTC m=+1184.411470379" Sep 30 18:02:13 crc kubenswrapper[4797]: I0930 18:02:13.978330 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" podStartSLOduration=7.978312619 podStartE2EDuration="7.978312619s" podCreationTimestamp="2025-09-30 18:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:13.911911298 +0000 UTC m=+1184.434410536" watchObservedRunningTime="2025-09-30 18:02:13.978312619 +0000 UTC m=+1184.500811857" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:13.998153 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fc4d4d55c-fzms2" podStartSLOduration=3.99813126 podStartE2EDuration="3.99813126s" podCreationTimestamp="2025-09-30 18:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:13.952960677 +0000 UTC m=+1184.475459915" watchObservedRunningTime="2025-09-30 18:02:13.99813126 +0000 UTC m=+1184.520630498" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010287 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-scripts\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010342 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-combined-ca-bundle\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010360 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-config-data\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-internal-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010534 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hfs\" (UniqueName: \"kubernetes.io/projected/cd02f0fa-36e8-4676-802c-37127e022ad0-kube-api-access-t5hfs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010575 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-public-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.010597 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd02f0fa-36e8-4676-802c-37127e022ad0-logs\") pod 
\"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.019057 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.019036761 podStartE2EDuration="11.019036761s" podCreationTimestamp="2025-09-30 18:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:13.977185319 +0000 UTC m=+1184.499684557" watchObservedRunningTime="2025-09-30 18:02:14.019036761 +0000 UTC m=+1184.541535999" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.022260 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-scripts\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.024019 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-combined-ca-bundle\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.024803 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd02f0fa-36e8-4676-802c-37127e022ad0-logs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.025791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-config-data\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.035240 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-internal-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.035418 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd02f0fa-36e8-4676-802c-37127e022ad0-public-tls-certs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.047569 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hfs\" (UniqueName: \"kubernetes.io/projected/cd02f0fa-36e8-4676-802c-37127e022ad0-kube-api-access-t5hfs\") pod \"placement-c47f4984-nxfz7\" (UID: \"cd02f0fa-36e8-4676-802c-37127e022ad0\") " pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.185492 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.341602 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421155 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421322 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421399 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421427 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.421483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7l7r\" (UniqueName: 
\"kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r\") pod \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\" (UID: \"6162f46a-e25e-4bcf-8e84-77d28c565c1c\") " Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.433778 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.434370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r" (OuterVolumeSpecName: "kube-api-access-m7l7r") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "kube-api-access-m7l7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.436202 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.436501 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts" (OuterVolumeSpecName: "scripts") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.464739 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data" (OuterVolumeSpecName: "config-data") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.480831 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6162f46a-e25e-4bcf-8e84-77d28c565c1c" (UID: "6162f46a-e25e-4bcf-8e84-77d28c565c1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.523796 4797 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.523818 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.523828 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.523838 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc 
kubenswrapper[4797]: I0930 18:02:14.523845 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6162f46a-e25e-4bcf-8e84-77d28c565c1c-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.523853 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7l7r\" (UniqueName: \"kubernetes.io/projected/6162f46a-e25e-4bcf-8e84-77d28c565c1c-kube-api-access-m7l7r\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.835043 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c47f4984-nxfz7"] Sep 30 18:02:14 crc kubenswrapper[4797]: W0930 18:02:14.848813 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd02f0fa_36e8_4676_802c_37127e022ad0.slice/crio-62f4d6ebc26e378a3954066c9d9388b455bb682d164619ada4ed515f73c1a404 WatchSource:0}: Error finding container 62f4d6ebc26e378a3954066c9d9388b455bb682d164619ada4ed515f73c1a404: Status 404 returned error can't find the container with id 62f4d6ebc26e378a3954066c9d9388b455bb682d164619ada4ed515f73c1a404 Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.883994 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7w9jn" event={"ID":"6162f46a-e25e-4bcf-8e84-77d28c565c1c","Type":"ContainerDied","Data":"dbf57f337765aaf465722aff3d0898be3e5eaf84178aabf29284c91a5a4e5121"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.884020 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7w9jn" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.884039 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf57f337765aaf465722aff3d0898be3e5eaf84178aabf29284c91a5a4e5121" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.888402 4797 generic.go:334] "Generic (PLEG): container finished" podID="592ad170-d90d-432f-a862-196f26e12d58" containerID="3e30b41d6dd6f435cf3ad2a5bd3b365151f30b61db5683352c9e20c9bfc92d79" exitCode=0 Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.888459 4797 generic.go:334] "Generic (PLEG): container finished" podID="592ad170-d90d-432f-a862-196f26e12d58" containerID="4da4408872f36a9a255e9ad81bc028da42b6c152b6104356d541f19f61fde6e2" exitCode=143 Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.888521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerDied","Data":"3e30b41d6dd6f435cf3ad2a5bd3b365151f30b61db5683352c9e20c9bfc92d79"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.888552 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerDied","Data":"4da4408872f36a9a255e9ad81bc028da42b6c152b6104356d541f19f61fde6e2"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.890382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c47f4984-nxfz7" event={"ID":"cd02f0fa-36e8-4676-802c-37127e022ad0","Type":"ContainerStarted","Data":"62f4d6ebc26e378a3954066c9d9388b455bb682d164619ada4ed515f73c1a404"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.901476 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b748fb867-znqws"] Sep 30 18:02:14 crc kubenswrapper[4797]: E0930 18:02:14.901819 4797 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6162f46a-e25e-4bcf-8e84-77d28c565c1c" containerName="keystone-bootstrap" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.901834 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6162f46a-e25e-4bcf-8e84-77d28c565c1c" containerName="keystone-bootstrap" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.902030 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6162f46a-e25e-4bcf-8e84-77d28c565c1c" containerName="keystone-bootstrap" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.902612 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.906895 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.913155 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.913363 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8tsjk" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.913536 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.913647 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.913833 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.914687 4797 generic.go:334] "Generic (PLEG): container finished" podID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerID="62f2f44fc288da505a801b8b169a87091f0df8b405570b1cb41808d2473ddc86" exitCode=0 Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.914709 
4797 generic.go:334] "Generic (PLEG): container finished" podID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerID="3a5ccafd363aa07a7ba63d2c87e4375de2bae4a4820ef0aba0e41a77ec9e3189" exitCode=143 Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.915508 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerDied","Data":"62f2f44fc288da505a801b8b169a87091f0df8b405570b1cb41808d2473ddc86"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.915535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerDied","Data":"3a5ccafd363aa07a7ba63d2c87e4375de2bae4a4820ef0aba0e41a77ec9e3189"} Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.921938 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b748fb867-znqws"] Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936577 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-internal-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936678 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-credential-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936719 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-fernet-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-config-data\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936791 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-scripts\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zkg\" (UniqueName: \"kubernetes.io/projected/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-kube-api-access-q2zkg\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.936993 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-public-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:14 crc kubenswrapper[4797]: I0930 18:02:14.937021 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-combined-ca-bundle\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.039778 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zkg\" (UniqueName: \"kubernetes.io/projected/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-kube-api-access-q2zkg\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.039828 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-public-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.039864 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-combined-ca-bundle\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.039950 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-internal-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.040003 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-credential-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.040068 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-fernet-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.040090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-config-data\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.040149 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-scripts\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.043878 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-internal-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.047819 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-public-tls-certs\") pod \"keystone-6b748fb867-znqws\" (UID: 
\"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.048262 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-scripts\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.048568 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-combined-ca-bundle\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.049825 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-credential-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.054259 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-config-data\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.054463 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-fernet-keys\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.071683 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zkg\" (UniqueName: \"kubernetes.io/projected/af775f0c-a3ef-4bd7-bf2e-cecdacda03ff-kube-api-access-q2zkg\") pod \"keystone-6b748fb867-znqws\" (UID: \"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff\") " pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.366935 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.410308 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.411595 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.582786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.584032 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:02:15 crc kubenswrapper[4797]: I0930 18:02:15.913243 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b748fb867-znqws"] Sep 30 18:02:15 crc kubenswrapper[4797]: W0930 18:02:15.929091 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf775f0c_a3ef_4bd7_bf2e_cecdacda03ff.slice/crio-8a511814476c8f040abba877e77b2fff6aa1f80cbb69ece24aa29622a563c360 WatchSource:0}: Error finding container 8a511814476c8f040abba877e77b2fff6aa1f80cbb69ece24aa29622a563c360: Status 404 returned error can't find the container with id 8a511814476c8f040abba877e77b2fff6aa1f80cbb69ece24aa29622a563c360 Sep 30 18:02:16 crc kubenswrapper[4797]: E0930 18:02:16.604098 4797 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:16 crc kubenswrapper[4797]: E0930 18:02:16.606267 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:16 crc kubenswrapper[4797]: E0930 18:02:16.609219 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:16 crc kubenswrapper[4797]: E0930 18:02:16.609330 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:16 crc kubenswrapper[4797]: I0930 18:02:16.943573 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b748fb867-znqws" event={"ID":"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff","Type":"ContainerStarted","Data":"8a511814476c8f040abba877e77b2fff6aa1f80cbb69ece24aa29622a563c360"} Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.471137 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 
18:02:17.495995 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.963852 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b748fb867-znqws" event={"ID":"af775f0c-a3ef-4bd7-bf2e-cecdacda03ff","Type":"ContainerStarted","Data":"e31ecd034af2b95515547a23eaa57b91449b50278d3b4d0a5f3f9dbe14a38daa"} Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.963959 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.966804 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c47f4984-nxfz7" event={"ID":"cd02f0fa-36e8-4676-802c-37127e022ad0","Type":"ContainerStarted","Data":"de5880c2450ae956759e083881298f3e046d08d51497afcbaa8b13f71029793a"} Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.971988 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 18:02:17 crc kubenswrapper[4797]: I0930 18:02:17.996855 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b748fb867-znqws" podStartSLOduration=3.996837936 podStartE2EDuration="3.996837936s" podCreationTimestamp="2025-09-30 18:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:17.990568065 +0000 UTC m=+1188.513067303" watchObservedRunningTime="2025-09-30 18:02:17.996837936 +0000 UTC m=+1188.519337174" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.627242 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.637689 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.665566 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666069 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666395 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666533 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666629 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666712 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.666825 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs" (OuterVolumeSpecName: "logs") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.667088 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.667217 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24cxj\" (UniqueName: \"kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj\") pod \"b12faf91-f7bd-4a27-9829-34a931141b7e\" (UID: \"b12faf91-f7bd-4a27-9829-34a931141b7e\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.667854 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.668026 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b12faf91-f7bd-4a27-9829-34a931141b7e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.702754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts" (OuterVolumeSpecName: "scripts") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.702850 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.704497 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj" (OuterVolumeSpecName: "kube-api-access-24cxj") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "kube-api-access-24cxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.724244 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.772990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.773090 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.773221 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nl5q\" (UniqueName: \"kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.773294 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.774072 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.774642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.774673 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.774719 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data\") pod \"592ad170-d90d-432f-a862-196f26e12d58\" (UID: \"592ad170-d90d-432f-a862-196f26e12d58\") " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.779500 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs" (OuterVolumeSpecName: "logs") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.786608 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.786629 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data" (OuterVolumeSpecName: "config-data") pod "b12faf91-f7bd-4a27-9829-34a931141b7e" (UID: "b12faf91-f7bd-4a27-9829-34a931141b7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788179 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788242 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788257 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24cxj\" (UniqueName: \"kubernetes.io/projected/b12faf91-f7bd-4a27-9829-34a931141b7e-kube-api-access-24cxj\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788271 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788280 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/592ad170-d90d-432f-a862-196f26e12d58-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788288 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788297 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12faf91-f7bd-4a27-9829-34a931141b7e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.788310 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.789336 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts" (OuterVolumeSpecName: "scripts") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.791692 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q" (OuterVolumeSpecName: "kube-api-access-9nl5q") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "kube-api-access-9nl5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.815516 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.815772 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api-log" containerID="cri-o://ab4cec5bc2c2a3091b7a39a4c36254544fc5c893fb1334ac5ac721112fc4464c" gracePeriod=30 Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.815978 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api" containerID="cri-o://25dc105cb75534d271a244678b5ef647b258f1830f9e257c231aa85b1a9cd98b" gracePeriod=30 Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.839904 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.849023 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.863264 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.887987 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data" (OuterVolumeSpecName: "config-data") pod "592ad170-d90d-432f-a862-196f26e12d58" (UID: "592ad170-d90d-432f-a862-196f26e12d58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.889783 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.889807 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.889817 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.889826 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nl5q\" (UniqueName: \"kubernetes.io/projected/592ad170-d90d-432f-a862-196f26e12d58-kube-api-access-9nl5q\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 
crc kubenswrapper[4797]: I0930 18:02:20.889839 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:20 crc kubenswrapper[4797]: I0930 18:02:20.889847 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592ad170-d90d-432f-a862-196f26e12d58-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.005666 4797 generic.go:334] "Generic (PLEG): container finished" podID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerID="ab4cec5bc2c2a3091b7a39a4c36254544fc5c893fb1334ac5ac721112fc4464c" exitCode=143 Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.005723 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerDied","Data":"ab4cec5bc2c2a3091b7a39a4c36254544fc5c893fb1334ac5ac721112fc4464c"} Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.007469 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b12faf91-f7bd-4a27-9829-34a931141b7e","Type":"ContainerDied","Data":"b32c8738108811e1794a01a610b52d0330a1e22cba8536944316cfa51914c153"} Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.007502 4797 scope.go:117] "RemoveContainer" containerID="62f2f44fc288da505a801b8b169a87091f0df8b405570b1cb41808d2473ddc86" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.007604 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.024044 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"592ad170-d90d-432f-a862-196f26e12d58","Type":"ContainerDied","Data":"40c72747e3cc8d6063fb30d63b3528835819066938c56cbeb9654a466ab8cae0"} Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.024156 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.071530 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.089559 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.116399 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.131608 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.153918 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.154374 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154398 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.154417 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-httpd" Sep 30 
18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154424 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-httpd" Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.154449 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154456 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.154471 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-httpd" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154479 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-httpd" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154656 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-httpd" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154676 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154687 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="592ad170-d90d-432f-a862-196f26e12d58" containerName="glance-httpd" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.154696 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" containerName="glance-log" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.155662 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.157258 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.159162 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.159510 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n2zcg" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.159729 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.176880 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.178491 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.182603 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.182743 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.200354 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204497 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204584 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204659 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204711 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204743 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskfk\" (UniqueName: \"kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204802 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204830 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.204913 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.219721 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 
18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306691 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306739 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306773 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306808 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306865 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 
18:02:21.306885 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskfk\" (UniqueName: \"kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306907 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306933 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306954 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h927d\" (UniqueName: \"kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.306983 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307003 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307019 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307069 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307096 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307142 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307161 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307217 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307478 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.307769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.314941 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.320057 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.321803 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.326130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.326610 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskfk\" (UniqueName: \"kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.364121 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h927d\" (UniqueName: 
\"kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408748 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408815 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408886 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408917 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.408948 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.409016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.409046 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.409623 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.411398 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.411559 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.415137 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.425262 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.429759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.432407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.444954 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h927d\" (UniqueName: \"kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.470638 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.475029 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.479822 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.507102 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.573093 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:02:21 crc kubenswrapper[4797]: I0930 18:02:21.573569 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="dnsmasq-dns" containerID="cri-o://1bac15db7d92419dbf76c3bfb7311c2a869872823af2abfda90a9d591c86c9fd" gracePeriod=10 Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.609932 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.611482 4797 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.613113 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 18:02:21 crc kubenswrapper[4797]: E0930 18:02:21.613154 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:22 crc kubenswrapper[4797]: I0930 18:02:22.041633 4797 generic.go:334] "Generic (PLEG): container finished" podID="ae7c614c-eec0-400f-8862-4a19e74046da" containerID="1bac15db7d92419dbf76c3bfb7311c2a869872823af2abfda90a9d591c86c9fd" exitCode=0 Sep 30 18:02:22 crc kubenswrapper[4797]: I0930 18:02:22.041689 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" event={"ID":"ae7c614c-eec0-400f-8862-4a19e74046da","Type":"ContainerDied","Data":"1bac15db7d92419dbf76c3bfb7311c2a869872823af2abfda90a9d591c86c9fd"} Sep 30 18:02:22 crc kubenswrapper[4797]: I0930 18:02:22.262733 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592ad170-d90d-432f-a862-196f26e12d58" path="/var/lib/kubelet/pods/592ad170-d90d-432f-a862-196f26e12d58/volumes" Sep 30 18:02:22 crc kubenswrapper[4797]: I0930 18:02:22.263840 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b12faf91-f7bd-4a27-9829-34a931141b7e" path="/var/lib/kubelet/pods/b12faf91-f7bd-4a27-9829-34a931141b7e/volumes" Sep 30 18:02:22 crc kubenswrapper[4797]: I0930 18:02:22.470104 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Sep 30 18:02:23 crc kubenswrapper[4797]: I0930 18:02:23.350770 4797 scope.go:117] "RemoveContainer" containerID="3a5ccafd363aa07a7ba63d2c87e4375de2bae4a4820ef0aba0e41a77ec9e3189" Sep 30 18:02:23 crc kubenswrapper[4797]: I0930 18:02:23.994955 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": read tcp 10.217.0.2:57906->10.217.0.163:9322: read: connection reset by peer" Sep 30 18:02:23 crc kubenswrapper[4797]: I0930 18:02:23.996090 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": read tcp 10.217.0.2:57892->10.217.0.163:9322: read: connection reset by peer" Sep 30 18:02:24 crc kubenswrapper[4797]: I0930 18:02:24.053149 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:02:24 crc kubenswrapper[4797]: I0930 18:02:24.180491 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:02:24 crc kubenswrapper[4797]: I0930 18:02:24.788327 4797 scope.go:117] "RemoveContainer" containerID="3e30b41d6dd6f435cf3ad2a5bd3b365151f30b61db5683352c9e20c9bfc92d79" Sep 30 18:02:24 crc kubenswrapper[4797]: W0930 18:02:24.873471 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod168d0430_28b2_43f2_a5fe_f8b0c35cec53.slice/crio-1b041671028f619525d8dd798258a7c1492f259f4a760931b41ece6829df093f WatchSource:0}: Error finding container 1b041671028f619525d8dd798258a7c1492f259f4a760931b41ece6829df093f: Status 404 returned error can't find the container with id 1b041671028f619525d8dd798258a7c1492f259f4a760931b41ece6829df093f Sep 30 18:02:24 crc kubenswrapper[4797]: W0930 18:02:24.877572 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa42c46_daf1_4414_9141_ff067cd3e2a2.slice/crio-19f5604cbd2d0b5dba90679db050135f5d08afe79aa0a7697206840466bff82a WatchSource:0}: Error finding container 19f5604cbd2d0b5dba90679db050135f5d08afe79aa0a7697206840466bff82a: Status 404 returned error can't find the container with id 19f5604cbd2d0b5dba90679db050135f5d08afe79aa0a7697206840466bff82a Sep 30 18:02:24 crc kubenswrapper[4797]: I0930 18:02:24.985905 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.109913 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.109977 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.110024 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.110118 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.110140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.110277 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49crk\" 
(UniqueName: \"kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk\") pod \"ae7c614c-eec0-400f-8862-4a19e74046da\" (UID: \"ae7c614c-eec0-400f-8862-4a19e74046da\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.131723 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk" (OuterVolumeSpecName: "kube-api-access-49crk") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "kube-api-access-49crk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.139904 4797 generic.go:334] "Generic (PLEG): container finished" podID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerID="25dc105cb75534d271a244678b5ef647b258f1830f9e257c231aa85b1a9cd98b" exitCode=0 Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.139991 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerDied","Data":"25dc105cb75534d271a244678b5ef647b258f1830f9e257c231aa85b1a9cd98b"} Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.199345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerStarted","Data":"1b041671028f619525d8dd798258a7c1492f259f4a760931b41ece6829df093f"} Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.205721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" event={"ID":"ae7c614c-eec0-400f-8862-4a19e74046da","Type":"ContainerDied","Data":"52734fa51073cad7cb3af688466539291f963fc561eadb5a20aa2204c2ff635e"} Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.205899 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-g7g5h" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.221631 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49crk\" (UniqueName: \"kubernetes.io/projected/ae7c614c-eec0-400f-8862-4a19e74046da-kube-api-access-49crk\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.243892 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.253638 4797 scope.go:117] "RemoveContainer" containerID="4da4408872f36a9a255e9ad81bc028da42b6c152b6104356d541f19f61fde6e2" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.259222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerStarted","Data":"19f5604cbd2d0b5dba90679db050135f5d08afe79aa0a7697206840466bff82a"} Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.267737 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.304269 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.317246 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config" (OuterVolumeSpecName: "config") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.320124 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae7c614c-eec0-400f-8862-4a19e74046da" (UID: "ae7c614c-eec0-400f-8862-4a19e74046da"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.326046 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.326080 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.326093 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.326108 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.326118 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae7c614c-eec0-400f-8862-4a19e74046da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.391145 4797 scope.go:117] "RemoveContainer" containerID="1bac15db7d92419dbf76c3bfb7311c2a869872823af2abfda90a9d591c86c9fd" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.439973 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 
18:02:25.459422 4797 scope.go:117] "RemoveContainer" containerID="e68a2e1054c5bf8d7d0a2bff511150b45276b6cf2e251155dffb6ca5004c9594" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.460626 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.558956 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.572840 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-g7g5h"] Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.591404 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.635208 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle\") pod \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.635720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbvf8\" (UniqueName: \"kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8\") pod \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.636631 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data\") pod \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.636661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs\") pod \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.636714 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca\") pod \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\" (UID: \"b4daf73b-c17d-4879-b82a-c7eca9bafbab\") " Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.637427 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs" (OuterVolumeSpecName: "logs") pod "b4daf73b-c17d-4879-b82a-c7eca9bafbab" (UID: "b4daf73b-c17d-4879-b82a-c7eca9bafbab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.643221 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4daf73b-c17d-4879-b82a-c7eca9bafbab-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.654912 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8" (OuterVolumeSpecName: "kube-api-access-kbvf8") pod "b4daf73b-c17d-4879-b82a-c7eca9bafbab" (UID: "b4daf73b-c17d-4879-b82a-c7eca9bafbab"). InnerVolumeSpecName "kube-api-access-kbvf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.687251 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4daf73b-c17d-4879-b82a-c7eca9bafbab" (UID: "b4daf73b-c17d-4879-b82a-c7eca9bafbab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.718336 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b4daf73b-c17d-4879-b82a-c7eca9bafbab" (UID: "b4daf73b-c17d-4879-b82a-c7eca9bafbab"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.732550 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data" (OuterVolumeSpecName: "config-data") pod "b4daf73b-c17d-4879-b82a-c7eca9bafbab" (UID: "b4daf73b-c17d-4879-b82a-c7eca9bafbab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.747361 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.747384 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbvf8\" (UniqueName: \"kubernetes.io/projected/b4daf73b-c17d-4879-b82a-c7eca9bafbab-kube-api-access-kbvf8\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.747399 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:25 crc kubenswrapper[4797]: I0930 18:02:25.747408 4797 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4daf73b-c17d-4879-b82a-c7eca9bafbab-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.281590 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" path="/var/lib/kubelet/pods/ae7c614c-eec0-400f-8862-4a19e74046da/volumes" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.302627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b4daf73b-c17d-4879-b82a-c7eca9bafbab","Type":"ContainerDied","Data":"f04df4542a24144659a5be64fb5300304deb7d9ac5c01fe97cca9950ae96ce5c"} Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.302690 4797 scope.go:117] "RemoveContainer" containerID="25dc105cb75534d271a244678b5ef647b258f1830f9e257c231aa85b1a9cd98b" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.302847 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.307396 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerStarted","Data":"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b"} Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.313518 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c47f4984-nxfz7" event={"ID":"cd02f0fa-36e8-4676-802c-37127e022ad0","Type":"ContainerStarted","Data":"28ff3e89155060d80ee73f33a5b38386144ea085b2e458e7e477c3d3ca7c5e4e"} Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.314425 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.314651 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c47f4984-nxfz7" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.333006 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hvw66" event={"ID":"81ba253b-ce64-4926-a8d3-1c8dd9dfef16","Type":"ContainerStarted","Data":"2bbb0045459939763b9fc5c7e459682d43046ab67bce2a3ba42bab69bac711d2"} Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.349530 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.373689 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.389339 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.389740 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api" Sep 30 18:02:26 crc 
kubenswrapper[4797]: I0930 18:02:26.389753 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api" Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.389762 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="init" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.389768 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="init" Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.389779 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="dnsmasq-dns" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.389785 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="dnsmasq-dns" Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.389803 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api-log" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.389809 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api-log" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.389993 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.390004 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" containerName="watcher-api-log" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.390024 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7c614c-eec0-400f-8862-4a19e74046da" containerName="dnsmasq-dns" Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.390494 4797 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c47f4984-nxfz7" podStartSLOduration=13.390477111 podStartE2EDuration="13.390477111s" podCreationTimestamp="2025-09-30 18:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:26.361513041 +0000 UTC m=+1196.884012279" watchObservedRunningTime="2025-09-30 18:02:26.390477111 +0000 UTC m=+1196.912976349"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.390962 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.397228 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.397528 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.397679 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.417038 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hvw66" podStartSLOduration=3.5182699570000002 podStartE2EDuration="1m0.417019356s" podCreationTimestamp="2025-09-30 18:01:26 +0000 UTC" firstStartedPulling="2025-09-30 18:01:28.492618401 +0000 UTC m=+1139.015117629" lastFinishedPulling="2025-09-30 18:02:25.39136779 +0000 UTC m=+1195.913867028" observedRunningTime="2025-09-30 18:02:26.413822358 +0000 UTC m=+1196.936321596" watchObservedRunningTime="2025-09-30 18:02:26.417019356 +0000 UTC m=+1196.939518594"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.418476 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.423312 4797 scope.go:117] "RemoveContainer" containerID="ab4cec5bc2c2a3091b7a39a4c36254544fc5c893fb1334ac5ac721112fc4464c"
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.549521 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.551085 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.552300 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.552342 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="af798459-89f2-474d-9082-eee9e1712e86" containerName="watcher-decision-engine"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571390 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg5pf\" (UniqueName: \"kubernetes.io/projected/84edad16-e218-42bd-bba3-77d16184436c-kube-api-access-hg5pf\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571591 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571698 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84edad16-e218-42bd-bba3-77d16184436c-logs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571803 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571889 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.571978 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-config-data\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.572067 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.613871 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.616884 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.621153 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:26 crc kubenswrapper[4797]: E0930 18:02:26.621284 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.673878 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.673969 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg5pf\" (UniqueName: \"kubernetes.io/projected/84edad16-e218-42bd-bba3-77d16184436c-kube-api-access-hg5pf\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.673987 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.674041 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84edad16-e218-42bd-bba3-77d16184436c-logs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.674063 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.674094 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.674134 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-config-data\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.675326 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84edad16-e218-42bd-bba3-77d16184436c-logs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.687048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.687132 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.688142 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.691355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.692010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edad16-e218-42bd-bba3-77d16184436c-config-data\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.703383 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg5pf\" (UniqueName: \"kubernetes.io/projected/84edad16-e218-42bd-bba3-77d16184436c-kube-api-access-hg5pf\") pod \"watcher-api-0\" (UID: \"84edad16-e218-42bd-bba3-77d16184436c\") " pod="openstack/watcher-api-0"
Sep 30 18:02:26 crc kubenswrapper[4797]: I0930 18:02:26.759922 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 18:02:27 crc kubenswrapper[4797]: I0930 18:02:27.377213 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 18:02:27 crc kubenswrapper[4797]: I0930 18:02:27.400124 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerStarted","Data":"64af618ad6e8219fd85aee8724a74aca56c8bfde45b9584e6df25bd31828cb8e"}
Sep 30 18:02:27 crc kubenswrapper[4797]: I0930 18:02:27.404457 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerStarted","Data":"7b99e91d80a68e7a3b3d90bf839e1e433c55613e00266315b8a324656d021a00"}
Sep 30 18:02:27 crc kubenswrapper[4797]: I0930 18:02:27.407671 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zrb5" event={"ID":"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844","Type":"ContainerStarted","Data":"03ccb9d672dd73c73fff5a145ef9ae0c478edaa773275a5b56ae9caa10998c44"}
Sep 30 18:02:27 crc kubenswrapper[4797]: I0930 18:02:27.455333 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2zrb5" podStartSLOduration=4.424863002 podStartE2EDuration="1m1.455304406s" podCreationTimestamp="2025-09-30 18:01:26 +0000 UTC" firstStartedPulling="2025-09-30 18:01:28.384308375 +0000 UTC m=+1138.906807613" lastFinishedPulling="2025-09-30 18:02:25.414749779 +0000 UTC m=+1195.937249017" observedRunningTime="2025-09-30 18:02:27.43786106 +0000 UTC m=+1197.960360308" watchObservedRunningTime="2025-09-30 18:02:27.455304406 +0000 UTC m=+1197.977803644"
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.457806 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.457770009 podStartE2EDuration="7.457770009s" podCreationTimestamp="2025-09-30 18:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:28.441581806 +0000 UTC m=+1198.964081054" watchObservedRunningTime="2025-09-30 18:02:28.457770009 +0000 UTC m=+1198.980269247"
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.461682 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4daf73b-c17d-4879-b82a-c7eca9bafbab" path="/var/lib/kubelet/pods/b4daf73b-c17d-4879-b82a-c7eca9bafbab/volumes"
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.466682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerStarted","Data":"a5f82ab525bcec3ffb150b4bdb370e28f3f4f5b4ea32e797a0d7bbf00e064cfd"}
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.466719 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerStarted","Data":"7cbf42b793fda441fe7d3a0dd7ca5499f2c3b15021f492441b9ed565fac10351"}
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.466734 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84edad16-e218-42bd-bba3-77d16184436c","Type":"ContainerStarted","Data":"d007ad22bb317925b2006416c78edb798dc628a06c52fb514753856eb7534c76"}
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.466750 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84edad16-e218-42bd-bba3-77d16184436c","Type":"ContainerStarted","Data":"abe41ceca59d00e7cc6abf0bd07b74d1aea75f7fef61cbb63919197b1ebe4111"}
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.484128 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.484106327 podStartE2EDuration="7.484106327s" podCreationTimestamp="2025-09-30 18:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:28.483949683 +0000 UTC m=+1199.006448921" watchObservedRunningTime="2025-09-30 18:02:28.484106327 +0000 UTC m=+1199.006605585"
Sep 30 18:02:28 crc kubenswrapper[4797]: I0930 18:02:28.818148 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c47f4984-nxfz7"
Sep 30 18:02:29 crc kubenswrapper[4797]: I0930 18:02:29.442088 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"84edad16-e218-42bd-bba3-77d16184436c","Type":"ContainerStarted","Data":"af786bc6e647f7ce7760ae6d6d64e1d083a3f44269baa6165ce3e07d1db211cd"}
Sep 30 18:02:29 crc kubenswrapper[4797]: I0930 18:02:29.469459 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.469441472 podStartE2EDuration="3.469441472s" podCreationTimestamp="2025-09-30 18:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:29.462115693 +0000 UTC m=+1199.984614951" watchObservedRunningTime="2025-09-30 18:02:29.469441472 +0000 UTC m=+1199.991940710"
Sep 30 18:02:30 crc kubenswrapper[4797]: I0930 18:02:30.461760 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Sep 30 18:02:30 crc kubenswrapper[4797]: I0930 18:02:30.463725 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="84edad16-e218-42bd-bba3-77d16184436c" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.175:9322/\": dial tcp 10.217.0.175:9322: connect: connection refused"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.476136 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.476651 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.508496 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.508551 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.518860 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.533386 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.560293 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.577162 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:31 crc kubenswrapper[4797]: E0930 18:02:31.603799 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:31 crc kubenswrapper[4797]: E0930 18:02:31.604971 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:31 crc kubenswrapper[4797]: E0930 18:02:31.607576 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:31 crc kubenswrapper[4797]: E0930 18:02:31.607618 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier"
Sep 30 18:02:31 crc kubenswrapper[4797]: I0930 18:02:31.760592 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Sep 30 18:02:32 crc kubenswrapper[4797]: I0930 18:02:32.483568 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:32 crc kubenswrapper[4797]: I0930 18:02:32.483860 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:32 crc kubenswrapper[4797]: I0930 18:02:32.483874 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:32 crc kubenswrapper[4797]: I0930 18:02:32.483885 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:34 crc kubenswrapper[4797]: I0930 18:02:34.502029 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:34 crc kubenswrapper[4797]: I0930 18:02:34.502308 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:34 crc kubenswrapper[4797]: I0930 18:02:34.502258 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:34 crc kubenswrapper[4797]: I0930 18:02:34.502512 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:34 crc kubenswrapper[4797]: I0930 18:02:34.533405 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 18:02:35 crc kubenswrapper[4797]: I0930 18:02:35.583598 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused"
Sep 30 18:02:36 crc kubenswrapper[4797]: I0930 18:02:36.542538 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-74f97446fb-hwg4l"
Sep 30 18:02:36 crc kubenswrapper[4797]: E0930 18:02:36.612492 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:36 crc kubenswrapper[4797]: E0930 18:02:36.619995 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:36 crc kubenswrapper[4797]: E0930 18:02:36.621600 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Sep 30 18:02:36 crc kubenswrapper[4797]: E0930 18:02:36.621645 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier"
Sep 30 18:02:36 crc kubenswrapper[4797]: I0930 18:02:36.761027 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Sep 30 18:02:36 crc kubenswrapper[4797]: I0930 18:02:36.775415 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.502822 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-ff7898f76-hfsxf"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.527523 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.527690 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.529423 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.560917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.608468 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.608825 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 18:02:37 crc kubenswrapper[4797]: I0930 18:02:37.637215 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 18:02:37 crc kubenswrapper[4797]: E0930 18:02:37.949022 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907"
Sep 30 18:02:38 crc kubenswrapper[4797]: I0930 18:02:38.553426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerStarted","Data":"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd"}
Sep 30 18:02:38 crc kubenswrapper[4797]: I0930 18:02:38.553537 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="ceilometer-notification-agent" containerID="cri-o://6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf" gracePeriod=30
Sep 30 18:02:38 crc kubenswrapper[4797]: I0930 18:02:38.553672 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="proxy-httpd" containerID="cri-o://eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd" gracePeriod=30
Sep 30 18:02:38 crc kubenswrapper[4797]: I0930 18:02:38.553745 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="sg-core" containerID="cri-o://e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b" gracePeriod=30
Sep 30 18:02:38 crc kubenswrapper[4797]: I0930 18:02:38.554292 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.383825 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-ff7898f76-hfsxf"
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.567033 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerID="eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd" exitCode=0
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.567068 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerID="e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b" exitCode=2
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.567070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerDied","Data":"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd"}
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.567119 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerDied","Data":"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b"}
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.568733 4797 generic.go:334] "Generic (PLEG): container finished" podID="af798459-89f2-474d-9082-eee9e1712e86" containerID="554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" exitCode=137
Sep 30 18:02:39 crc kubenswrapper[4797]: I0930 18:02:39.568774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af798459-89f2-474d-9082-eee9e1712e86","Type":"ContainerDied","Data":"554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88"}
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.099491 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.106308 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217105 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca\") pod \"af798459-89f2-474d-9082-eee9e1712e86\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217195 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs\") pod \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217214 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2sgf\" (UniqueName: \"kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf\") pod \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs\") pod \"af798459-89f2-474d-9082-eee9e1712e86\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217770 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs" (OuterVolumeSpecName: "logs") pod "af798459-89f2-474d-9082-eee9e1712e86" (UID: "af798459-89f2-474d-9082-eee9e1712e86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217818 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs" (OuterVolumeSpecName: "logs") pod "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" (UID: "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217259 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data\") pod \"af798459-89f2-474d-9082-eee9e1712e86\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217922 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle\") pod \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217950 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data\") pod \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\" (UID: \"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.217974 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle\") pod \"af798459-89f2-474d-9082-eee9e1712e86\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.218111 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpc5r\" (UniqueName: \"kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r\") pod \"af798459-89f2-474d-9082-eee9e1712e86\" (UID: \"af798459-89f2-474d-9082-eee9e1712e86\") "
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.218596 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-logs\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.218613 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af798459-89f2-474d-9082-eee9e1712e86-logs\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.221845 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf" (OuterVolumeSpecName: "kube-api-access-c2sgf") pod "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" (UID: "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a"). InnerVolumeSpecName "kube-api-access-c2sgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.223006 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r" (OuterVolumeSpecName: "kube-api-access-fpc5r") pod "af798459-89f2-474d-9082-eee9e1712e86" (UID: "af798459-89f2-474d-9082-eee9e1712e86"). InnerVolumeSpecName "kube-api-access-fpc5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.258272 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" (UID: "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.266291 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "af798459-89f2-474d-9082-eee9e1712e86" (UID: "af798459-89f2-474d-9082-eee9e1712e86"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.269213 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af798459-89f2-474d-9082-eee9e1712e86" (UID: "af798459-89f2-474d-9082-eee9e1712e86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.273711 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data" (OuterVolumeSpecName: "config-data") pod "af798459-89f2-474d-9082-eee9e1712e86" (UID: "af798459-89f2-474d-9082-eee9e1712e86"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.292776 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data" (OuterVolumeSpecName: "config-data") pod "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" (UID: "d82685eb-161c-4efc-b9d8-0bdb72ca4d0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320211 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpc5r\" (UniqueName: \"kubernetes.io/projected/af798459-89f2-474d-9082-eee9e1712e86-kube-api-access-fpc5r\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320244 4797 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320253 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2sgf\" (UniqueName: \"kubernetes.io/projected/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-kube-api-access-c2sgf\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320264 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320275 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320283 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.320291 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af798459-89f2-474d-9082-eee9e1712e86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.582357 4797 generic.go:334] "Generic (PLEG): container finished" podID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" exitCode=137 Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.582472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a","Type":"ContainerDied","Data":"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310"} Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.582525 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d82685eb-161c-4efc-b9d8-0bdb72ca4d0a","Type":"ContainerDied","Data":"a211729f725d3d1b7a906984f85fee82253177dcffdd1ef20ebc92e9b750c786"} Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.582546 4797 scope.go:117] "RemoveContainer" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.584020 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.584468 4797 generic.go:334] "Generic (PLEG): container finished" podID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" containerID="2bbb0045459939763b9fc5c7e459682d43046ab67bce2a3ba42bab69bac711d2" exitCode=0 Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.584493 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hvw66" event={"ID":"81ba253b-ce64-4926-a8d3-1c8dd9dfef16","Type":"ContainerDied","Data":"2bbb0045459939763b9fc5c7e459682d43046ab67bce2a3ba42bab69bac711d2"} Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.586685 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af798459-89f2-474d-9082-eee9e1712e86","Type":"ContainerDied","Data":"b3d0eab54a57c1274ae921c03b8cf7e47400a2cc6a729299a747dfcdba267d9b"} Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.586776 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.614306 4797 scope.go:117] "RemoveContainer" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" Sep 30 18:02:40 crc kubenswrapper[4797]: E0930 18:02:40.618134 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310\": container with ID starting with 4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310 not found: ID does not exist" containerID="4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.618194 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310"} err="failed to get container status \"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310\": rpc error: code = NotFound desc = could not find container \"4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310\": container with ID starting with 4fe76761b6f2c4d05a96dd6bbdd62b718912baf8fd519c886d95bee485226310 not found: ID does not exist" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.618236 4797 scope.go:117] "RemoveContainer" containerID="554d885cf2baf6265f707ec0686afc73fb0f6b19e57503110da7f8518fbece88" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.630129 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.644773 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.653633 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: 
I0930 18:02:40.663599 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.672808 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: E0930 18:02:40.673410 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af798459-89f2-474d-9082-eee9e1712e86" containerName="watcher-decision-engine" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.673503 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="af798459-89f2-474d-9082-eee9e1712e86" containerName="watcher-decision-engine" Sep 30 18:02:40 crc kubenswrapper[4797]: E0930 18:02:40.673585 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.673635 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.673872 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="af798459-89f2-474d-9082-eee9e1712e86" containerName="watcher-decision-engine" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.673953 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" containerName="watcher-applier" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.674633 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.678342 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.684151 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.691404 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.692598 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.696228 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.710127 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.717565 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.721623 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fc4d4d55c-fzms2" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.779284 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.779980 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74f97446fb-hwg4l" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-api" containerID="cri-o://c080b2266602c152fc73c4139b8eed2322891325c1ced93ea64e972e3f4266e5" gracePeriod=30 Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.780401 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74f97446fb-hwg4l" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-httpd" containerID="cri-o://df75fae43cfd850b1992da0675a8654efd4971a9b346815a07670649b0aeb8ba" gracePeriod=30 Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.829833 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.829884 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-config-data\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.829934 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.829974 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.829989 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a04de7be-0f64-475b-8f90-5fb466645c02-logs\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.830013 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.830101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/a04de7be-0f64-475b-8f90-5fb466645c02-kube-api-access-lsqp4\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.830533 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qtc9\" (UniqueName: \"kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.830689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932336 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932454 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a04de7be-0f64-475b-8f90-5fb466645c02-logs\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932550 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932765 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/a04de7be-0f64-475b-8f90-5fb466645c02-kube-api-access-lsqp4\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932890 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qtc9\" (UniqueName: \"kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9\") pod \"watcher-decision-engine-0\" (UID: 
\"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.932989 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.933063 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.933156 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-config-data\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.933524 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.934413 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a04de7be-0f64-475b-8f90-5fb466645c02-logs\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.940097 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.940650 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-config-data\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.941759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a04de7be-0f64-475b-8f90-5fb466645c02-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.941970 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.950636 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.954855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qtc9\" (UniqueName: 
\"kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9\") pod \"watcher-decision-engine-0\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.959689 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsqp4\" (UniqueName: \"kubernetes.io/projected/a04de7be-0f64-475b-8f90-5fb466645c02-kube-api-access-lsqp4\") pod \"watcher-applier-0\" (UID: \"a04de7be-0f64-475b-8f90-5fb466645c02\") " pod="openstack/watcher-applier-0" Sep 30 18:02:40 crc kubenswrapper[4797]: I0930 18:02:40.993785 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.014833 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.523094 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 18:02:41 crc kubenswrapper[4797]: W0930 18:02:41.525626 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda04de7be_0f64_475b_8f90_5fb466645c02.slice/crio-d67a5c927782f504ff8edad6d67eafc95f670117b19fc55a90d0d848065b8fe1 WatchSource:0}: Error finding container d67a5c927782f504ff8edad6d67eafc95f670117b19fc55a90d0d848065b8fe1: Status 404 returned error can't find the container with id d67a5c927782f504ff8edad6d67eafc95f670117b19fc55a90d0d848065b8fe1 Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.593773 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.606820 4797 generic.go:334] "Generic (PLEG): container finished" podID="2152a741-6045-4223-8d2a-9a1c24191d99" 
containerID="df75fae43cfd850b1992da0675a8654efd4971a9b346815a07670649b0aeb8ba" exitCode=0 Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.606971 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerDied","Data":"df75fae43cfd850b1992da0675a8654efd4971a9b346815a07670649b0aeb8ba"} Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.611143 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a04de7be-0f64-475b-8f90-5fb466645c02","Type":"ContainerStarted","Data":"d67a5c927782f504ff8edad6d67eafc95f670117b19fc55a90d0d848065b8fe1"} Sep 30 18:02:41 crc kubenswrapper[4797]: W0930 18:02:41.612503 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f9c8682_0d0e_4fca_b9a2_95191f7ec66b.slice/crio-9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b WatchSource:0}: Error finding container 9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b: Status 404 returned error can't find the container with id 9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.612874 4797 generic.go:334] "Generic (PLEG): container finished" podID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" containerID="03ccb9d672dd73c73fff5a145ef9ae0c478edaa773275a5b56ae9caa10998c44" exitCode=0 Sep 30 18:02:41 crc kubenswrapper[4797]: I0930 18:02:41.614891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zrb5" event={"ID":"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844","Type":"ContainerDied","Data":"03ccb9d672dd73c73fff5a145ef9ae0c478edaa773275a5b56ae9caa10998c44"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.038781 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hvw66" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.161206 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data\") pod \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.161300 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbpnz\" (UniqueName: \"kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz\") pod \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.161425 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle\") pod \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\" (UID: \"81ba253b-ce64-4926-a8d3-1c8dd9dfef16\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.167525 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz" (OuterVolumeSpecName: "kube-api-access-vbpnz") pod "81ba253b-ce64-4926-a8d3-1c8dd9dfef16" (UID: "81ba253b-ce64-4926-a8d3-1c8dd9dfef16"). InnerVolumeSpecName "kube-api-access-vbpnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.167566 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "81ba253b-ce64-4926-a8d3-1c8dd9dfef16" (UID: "81ba253b-ce64-4926-a8d3-1c8dd9dfef16"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.190137 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ba253b-ce64-4926-a8d3-1c8dd9dfef16" (UID: "81ba253b-ce64-4926-a8d3-1c8dd9dfef16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.249060 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af798459-89f2-474d-9082-eee9e1712e86" path="/var/lib/kubelet/pods/af798459-89f2-474d-9082-eee9e1712e86/volumes" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.249987 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82685eb-161c-4efc-b9d8-0bdb72ca4d0a" path="/var/lib/kubelet/pods/d82685eb-161c-4efc-b9d8-0bdb72ca4d0a/volumes" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.264886 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.264918 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbpnz\" (UniqueName: \"kubernetes.io/projected/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-kube-api-access-vbpnz\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.264930 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ba253b-ce64-4926-a8d3-1c8dd9dfef16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.629384 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.666067 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hvw66" event={"ID":"81ba253b-ce64-4926-a8d3-1c8dd9dfef16","Type":"ContainerDied","Data":"adf512f3ac3be8dfb0649eeee84fc8479f6cd8039e16cd4298cdd41625d36576"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.666104 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf512f3ac3be8dfb0649eeee84fc8479f6cd8039e16cd4298cdd41625d36576" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.666167 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hvw66" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.668125 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a04de7be-0f64-475b-8f90-5fb466645c02","Type":"ContainerStarted","Data":"ff756eacca6b716e77f127cd78fe87d2a06567608a301b24595526aaa478796d"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671159 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671241 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671295 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd\") pod 
\"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671342 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671421 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4lsr\" (UniqueName: \"kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671456 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.671481 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle\") pod \"4b69b34b-4d04-4a75-86cc-62cc21727907\" (UID: \"4b69b34b-4d04-4a75-86cc-62cc21727907\") " Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.673839 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.674699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b","Type":"ContainerStarted","Data":"16815b620a0c0bde09222225a3d16bf86d63b745e21f7334c4cf084e5dd9d911"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.674735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b","Type":"ContainerStarted","Data":"9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.676682 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts" (OuterVolumeSpecName: "scripts") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.677681 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.677893 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerID="6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf" exitCode=0 Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.678023 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.678095 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerDied","Data":"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.678132 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b69b34b-4d04-4a75-86cc-62cc21727907","Type":"ContainerDied","Data":"c6e437669840d6e719bf83c5a82adb532f146d92a20eb2743750bcad96694093"} Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.678178 4797 scope.go:117] "RemoveContainer" containerID="eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.685205 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr" (OuterVolumeSpecName: "kube-api-access-w4lsr") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "kube-api-access-w4lsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.692375 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.692355713 podStartE2EDuration="2.692355713s" podCreationTimestamp="2025-09-30 18:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:42.688116888 +0000 UTC m=+1213.210616126" watchObservedRunningTime="2025-09-30 18:02:42.692355713 +0000 UTC m=+1213.214854951" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.712952 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.712933605 podStartE2EDuration="2.712933605s" podCreationTimestamp="2025-09-30 18:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:42.705458121 +0000 UTC m=+1213.227957369" watchObservedRunningTime="2025-09-30 18:02:42.712933605 +0000 UTC m=+1213.235432844" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.713427 4797 scope.go:117] "RemoveContainer" containerID="e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.714261 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.744187 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.768208 4797 scope.go:117] "RemoveContainer" containerID="6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.769644 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data" (OuterVolumeSpecName: "config-data") pod "4b69b34b-4d04-4a75-86cc-62cc21727907" (UID: "4b69b34b-4d04-4a75-86cc-62cc21727907"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773476 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773500 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b69b34b-4d04-4a75-86cc-62cc21727907-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773509 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773519 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4lsr\" (UniqueName: \"kubernetes.io/projected/4b69b34b-4d04-4a75-86cc-62cc21727907-kube-api-access-w4lsr\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773528 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773537 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.773544 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b69b34b-4d04-4a75-86cc-62cc21727907-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.807702 4797 scope.go:117] 
"RemoveContainer" containerID="eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.834622 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd\": container with ID starting with eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd not found: ID does not exist" containerID="eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.834682 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd"} err="failed to get container status \"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd\": rpc error: code = NotFound desc = could not find container \"eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd\": container with ID starting with eb76f94fd0ef668a0eefaebfb26e8fc3b34f3876245eda9086c905c7aa9fe3bd not found: ID does not exist" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.834710 4797 scope.go:117] "RemoveContainer" containerID="e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.836993 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b\": container with ID starting with e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b not found: ID does not exist" containerID="e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.837025 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b"} err="failed to get container status \"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b\": rpc error: code = NotFound desc = could not find container \"e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b\": container with ID starting with e3f7b98e66dae97da8d8f75fac891e4413a5fb2ba83bccd6c8765062a027735b not found: ID does not exist" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.837046 4797 scope.go:117] "RemoveContainer" containerID="6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.845609 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf\": container with ID starting with 6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf not found: ID does not exist" containerID="6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.845652 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf"} err="failed to get container status \"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf\": rpc error: code = NotFound desc = could not find container \"6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf\": container with ID starting with 6bded12e79fc30ff269e4b916a808a537500124eb69b0a9ac7ad3130a60773cf not found: ID does not exist" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.859505 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5768c5b854-k959d"] Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.859854 4797 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" containerName="barbican-db-sync" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.859870 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" containerName="barbican-db-sync" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.859890 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="proxy-httpd" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.859897 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="proxy-httpd" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.859907 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="sg-core" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.859913 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="sg-core" Sep 30 18:02:42 crc kubenswrapper[4797]: E0930 18:02:42.859927 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="ceilometer-notification-agent" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.859932 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="ceilometer-notification-agent" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.860097 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="proxy-httpd" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.860110 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="ceilometer-notification-agent" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.860130 4797 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" containerName="sg-core" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.860141 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" containerName="barbican-db-sync" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.861020 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.874726 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.878995 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6k5z6" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.883240 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5768c5b854-k959d"] Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.885785 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.984670 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-logs\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.984734 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " 
pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.984770 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data-custom\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.984803 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzx2\" (UniqueName: \"kubernetes.io/projected/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-kube-api-access-wvzx2\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:42 crc kubenswrapper[4797]: I0930 18:02:42.984870 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-combined-ca-bundle\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.058021 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b75d759d5-6bwm5"] Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.059794 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.066396 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.075505 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b75d759d5-6bwm5"] Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087555 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-combined-ca-bundle\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087617 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087650 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data-custom\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087688 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-combined-ca-bundle\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: 
\"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087721 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080f211c-e410-4f16-af62-78ce0d6d9d26-logs\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087741 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktpw\" (UniqueName: \"kubernetes.io/projected/080f211c-e410-4f16-af62-78ce0d6d9d26-kube-api-access-pktpw\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087786 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-logs\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087812 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087831 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data-custom\") pod 
\"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.087854 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzx2\" (UniqueName: \"kubernetes.io/projected/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-kube-api-access-wvzx2\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.095786 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"] Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.098342 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.100480 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-logs\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.112406 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data-custom\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.113398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-config-data\") pod 
\"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.132962 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"] Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.140169 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-combined-ca-bundle\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.156575 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzx2\" (UniqueName: \"kubernetes.io/projected/9c9236f9-becb-4d5c-aeb5-56a3b0547c86-kube-api-access-wvzx2\") pod \"barbican-keystone-listener-5768c5b854-k959d\" (UID: \"9c9236f9-becb-4d5c-aeb5-56a3b0547c86\") " pod="openstack/barbican-keystone-listener-5768c5b854-k959d" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190782 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc 
kubenswrapper[4797]: I0930 18:02:43.190809 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data-custom\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190825 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190842 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5t4t\" (UniqueName: \"kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190879 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190904 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-combined-ca-bundle\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5" Sep 30 
18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190931 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080f211c-e410-4f16-af62-78ce0d6d9d26-logs\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190966 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktpw\" (UniqueName: \"kubernetes.io/projected/080f211c-e410-4f16-af62-78ce0d6d9d26-kube-api-access-pktpw\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.190991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.197976 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/080f211c-e410-4f16-af62-78ce0d6d9d26-logs\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.202829 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5768c5b854-k959d"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.216060 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data-custom\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.220791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-config-data\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.224102 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080f211c-e410-4f16-af62-78ce0d6d9d26-combined-ca-bundle\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.228888 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktpw\" (UniqueName: \"kubernetes.io/projected/080f211c-e410-4f16-af62-78ce0d6d9d26-kube-api-access-pktpw\") pod \"barbican-worker-7b75d759d5-6bwm5\" (UID: \"080f211c-e410-4f16-af62-78ce0d6d9d26\") " pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.261905 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.300704 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.300942 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5t4t\" (UniqueName: \"kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.301027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.301126 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.301204 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.301343 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.302584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.303149 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.305900 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.306415 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.306930 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.311494 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.331149 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b75d759d5-6bwm5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.383341 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5t4t\" (UniqueName: \"kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t\") pod \"dnsmasq-dns-75c8ddd69c-rgtlk\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.392409 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.395282 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.400943 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.449739 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.476968 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.494006 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.497215 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.498579 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.520005 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.520427 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.521821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.521865 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85zhg\" (UniqueName: \"kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.522035 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.568099 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.606371 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623485 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbhj\" (UniqueName: \"kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623546 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623575 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623821 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623875 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.623910 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85zhg\" (UniqueName: \"kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624031 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624053 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624079 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624180 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624214 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.624738 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.632252 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2zrb5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.633174 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.636329 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.637531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.650298 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85zhg\" (UniqueName: \"kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg\") pod \"barbican-api-5bbffffb8-2h9zs\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.718956 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2zrb5"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.719339 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zrb5" event={"ID":"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844","Type":"ContainerDied","Data":"72adee500a99840328dcb4d96d6fac93d3c5a92e61c8929bcaf22da7b9a39983"}
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.719358 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72adee500a99840328dcb4d96d6fac93d3c5a92e61c8929bcaf22da7b9a39983"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.740698 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.740903 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6sbm\" (UniqueName: \"kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.740957 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.741110 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.741255 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.741293 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts\") pod \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\" (UID: \"7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844\") "
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742163 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742255 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742281 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742296 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742312 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742516 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742757 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbhj\" (UniqueName: \"kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742882 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.742966 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-etc-machine-id\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.743489 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.744211 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm" (OuterVolumeSpecName: "kube-api-access-f6sbm") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "kube-api-access-f6sbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.746141 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.750846 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.750938 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts" (OuterVolumeSpecName: "scripts") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.761393 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.764085 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.768803 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.769545 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.770540 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbhj\" (UniqueName: \"kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj\") pod \"ceilometer-0\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.771235 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5768c5b854-k959d"]
Sep 30 crc kubenswrapper[4797]: I0930 18:02:43.777035 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbffffb8-2h9zs"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.814540 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.837034 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.837686 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data" (OuterVolumeSpecName: "config-data") pod "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" (UID: "7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.845014 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.845037 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.845049 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6sbm\" (UniqueName: \"kubernetes.io/projected/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-kube-api-access-f6sbm\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.845163 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:43 crc kubenswrapper[4797]: I0930 18:02:43.846644 4797 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.032301 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b75d759d5-6bwm5"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.144815 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.223886 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c47f4984-nxfz7"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.288999 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b69b34b-4d04-4a75-86cc-62cc21727907" path="/var/lib/kubelet/pods/4b69b34b-4d04-4a75-86cc-62cc21727907/volumes"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.413350 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.593341 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.739879 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b75d759d5-6bwm5" event={"ID":"080f211c-e410-4f16-af62-78ce0d6d9d26","Type":"ContainerStarted","Data":"6c775efc368c0b4157032f0eed803c5d278f03f93b81d60665b7715de555d85b"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.742767 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerStarted","Data":"4ed8c737d78a9cd88b2e314375229ac1529b412e1cc95b7de6203300c0c33b38"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.746317 4797 generic.go:334] "Generic (PLEG): container finished" podID="7a254b0d-9c02-4245-89e1-166a052becfb" containerID="07aec71389677ee2f119d654e20546ab8f91f269226735869c5683e94e2e7a5a" exitCode=0
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.746375 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" event={"ID":"7a254b0d-9c02-4245-89e1-166a052becfb","Type":"ContainerDied","Data":"07aec71389677ee2f119d654e20546ab8f91f269226735869c5683e94e2e7a5a"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.746392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" event={"ID":"7a254b0d-9c02-4245-89e1-166a052becfb","Type":"ContainerStarted","Data":"611dd46a949b98f97b50901e13575fe3e174b9376af7921c0c5256bb21a826d3"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.754482 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5768c5b854-k959d" event={"ID":"9c9236f9-becb-4d5c-aeb5-56a3b0547c86","Type":"ContainerStarted","Data":"1b351899a918458198ff8e417554374f5a93cfa2f3db3ce7f327c3641cfdff3c"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.757926 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerStarted","Data":"8415d534e865748dd514b8e93227d1586e5e9892fb6d2447d1aae16090822896"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.757955 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerStarted","Data":"299b24373bf5385e853cfb7f21d1d8a2381f6e49987375dde74a5986e0a02768"}
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.935507 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 18:02:44 crc kubenswrapper[4797]: E0930 18:02:44.936066 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" containerName="cinder-db-sync"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.936083 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" containerName="cinder-db-sync"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.936351 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" containerName="cinder-db-sync"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.937679 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.939825 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pwstb"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.940706 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.942057 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.953893 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.957464 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.973266 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.983612 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"]
Sep 30 18:02:44 crc kubenswrapper[4797]: I0930 18:02:44.985048 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-msj2h"
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.054283 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"]
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.078287 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d89f\" (UniqueName: \"kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0"
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081101 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h"
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081202 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h"
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081248 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0"
Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081315 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-2rf6r\" (UniqueName: \"kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081386 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081420 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081521 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081593 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081656 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081708 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.081752 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.157870 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.159508 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.167070 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.168890 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186113 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d89f\" (UniqueName: \"kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186160 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186233 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186268 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2rf6r\" (UniqueName: \"kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186305 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186327 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186378 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186413 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186464 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186497 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.186520 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.210407 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.211685 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: E0930 18:02:45.212974 4797 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 30 18:02:45 crc kubenswrapper[4797]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/7a254b0d-9c02-4245-89e1-166a052becfb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 18:02:45 crc kubenswrapper[4797]: > podSandboxID="611dd46a949b98f97b50901e13575fe3e174b9376af7921c0c5256bb21a826d3" Sep 30 18:02:45 crc kubenswrapper[4797]: E0930 18:02:45.214104 4797 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 18:02:45 crc kubenswrapper[4797]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h86hd4h5f9hc8h599h5h56bh75h554h597h5f4hb7h98h58fh66ch57ch668h5bfhd8h596h68dh54h8ch674h587h5bdhb9hc4h695h5b8hccq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5t4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75c8ddd69c-rgtlk_openstack(7a254b0d-9c02-4245-89e1-166a052becfb): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/7a254b0d-9c02-4245-89e1-166a052becfb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 18:02:45 crc kubenswrapper[4797]: > logger="UnhandledError" Sep 30 18:02:45 crc kubenswrapper[4797]: E0930 18:02:45.217458 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/7a254b0d-9c02-4245-89e1-166a052becfb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" podUID="7a254b0d-9c02-4245-89e1-166a052becfb" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.218457 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.218909 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.219171 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.222131 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.222051 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.222955 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.226228 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.240390 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d89f\" (UniqueName: \"kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.242288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.260251 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rf6r\" (UniqueName: 
\"kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r\") pod \"dnsmasq-dns-5784cf869f-msj2h\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311193 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311260 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311316 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311389 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data\") pod 
\"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.311425 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.315729 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78kv\" (UniqueName: \"kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.365919 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.401886 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.416980 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417172 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417205 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417222 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.417276 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78kv\" (UniqueName: \"kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.418374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.419249 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.425422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.431529 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.441815 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.445068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.451970 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78kv\" (UniqueName: \"kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv\") pod \"cinder-api-0\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.749877 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.876393 4797 generic.go:334] "Generic (PLEG): container finished" podID="2152a741-6045-4223-8d2a-9a1c24191d99" containerID="c080b2266602c152fc73c4139b8eed2322891325c1ced93ea64e972e3f4266e5" exitCode=0 Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.876663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerDied","Data":"c080b2266602c152fc73c4139b8eed2322891325c1ced93ea64e972e3f4266e5"} Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.920997 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerStarted","Data":"c6cf3c12ebb6936a344d9193d65cfe2fdeef6a522bdcdca83805e60e49e579ea"} Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.921036 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.921358 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:02:45 crc kubenswrapper[4797]: I0930 18:02:45.984084 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bbffffb8-2h9zs" podStartSLOduration=2.984068681 podStartE2EDuration="2.984068681s" podCreationTimestamp="2025-09-30 18:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:45.981964524 +0000 UTC m=+1216.504463752" watchObservedRunningTime="2025-09-30 18:02:45.984068681 +0000 UTC m=+1216.506567919" Sep 30 18:02:46 crc kubenswrapper[4797]: I0930 18:02:46.018540 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-applier-0" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.321844 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.329547 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479604 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479686 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config\") pod \"2152a741-6045-4223-8d2a-9a1c24191d99\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479710 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config\") pod \"2152a741-6045-4223-8d2a-9a1c24191d99\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479777 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5t4t\" (UniqueName: 
\"kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs\") pod \"2152a741-6045-4223-8d2a-9a1c24191d99\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479894 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.479962 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.480036 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle\") pod \"2152a741-6045-4223-8d2a-9a1c24191d99\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.480095 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb\") pod \"7a254b0d-9c02-4245-89e1-166a052becfb\" (UID: \"7a254b0d-9c02-4245-89e1-166a052becfb\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 
18:02:47.480183 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qt25\" (UniqueName: \"kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25\") pod \"2152a741-6045-4223-8d2a-9a1c24191d99\" (UID: \"2152a741-6045-4223-8d2a-9a1c24191d99\") " Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.494911 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25" (OuterVolumeSpecName: "kube-api-access-2qt25") pod "2152a741-6045-4223-8d2a-9a1c24191d99" (UID: "2152a741-6045-4223-8d2a-9a1c24191d99"). InnerVolumeSpecName "kube-api-access-2qt25". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.509651 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t" (OuterVolumeSpecName: "kube-api-access-d5t4t") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "kube-api-access-d5t4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.509682 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2152a741-6045-4223-8d2a-9a1c24191d99" (UID: "2152a741-6045-4223-8d2a-9a1c24191d99"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.582361 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qt25\" (UniqueName: \"kubernetes.io/projected/2152a741-6045-4223-8d2a-9a1c24191d99-kube-api-access-2qt25\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.582386 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.582395 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5t4t\" (UniqueName: \"kubernetes.io/projected/7a254b0d-9c02-4245-89e1-166a052becfb-kube-api-access-d5t4t\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.628686 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.684165 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.705131 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config" (OuterVolumeSpecName: "config") pod "2152a741-6045-4223-8d2a-9a1c24191d99" (UID: "2152a741-6045-4223-8d2a-9a1c24191d99"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.730708 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.733663 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2152a741-6045-4223-8d2a-9a1c24191d99" (UID: "2152a741-6045-4223-8d2a-9a1c24191d99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.736320 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.741844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.784872 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config" (OuterVolumeSpecName: "config") pod "7a254b0d-9c02-4245-89e1-166a052becfb" (UID: "7a254b0d-9c02-4245-89e1-166a052becfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786148 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786163 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786172 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786183 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786191 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a254b0d-9c02-4245-89e1-166a052becfb-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.786200 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.806632 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2152a741-6045-4223-8d2a-9a1c24191d99" (UID: "2152a741-6045-4223-8d2a-9a1c24191d99"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.887759 4797 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2152a741-6045-4223-8d2a-9a1c24191d99-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.947477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74f97446fb-hwg4l" event={"ID":"2152a741-6045-4223-8d2a-9a1c24191d99","Type":"ContainerDied","Data":"fd1e37a6a7ac74fd4dab6e106e2a67c1b2a77f90cefeda57adc5543d681ce73c"} Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.947798 4797 scope.go:117] "RemoveContainer" containerID="df75fae43cfd850b1992da0675a8654efd4971a9b346815a07670649b0aeb8ba" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.948078 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74f97446fb-hwg4l" Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.949687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" event={"ID":"7a254b0d-9c02-4245-89e1-166a052becfb","Type":"ContainerDied","Data":"611dd46a949b98f97b50901e13575fe3e174b9376af7921c0c5256bb21a826d3"} Sep 30 18:02:47 crc kubenswrapper[4797]: I0930 18:02:47.949816 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rgtlk" Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.017370 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"] Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.028277 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rgtlk"] Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.036369 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.045152 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-74f97446fb-hwg4l"] Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.253845 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" path="/var/lib/kubelet/pods/2152a741-6045-4223-8d2a-9a1c24191d99/volumes" Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.254676 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a254b0d-9c02-4245-89e1-166a052becfb" path="/var/lib/kubelet/pods/7a254b0d-9c02-4245-89e1-166a052becfb/volumes" Sep 30 18:02:48 crc kubenswrapper[4797]: I0930 18:02:48.610487 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.261138 4797 scope.go:117] "RemoveContainer" containerID="c080b2266602c152fc73c4139b8eed2322891325c1ced93ea64e972e3f4266e5" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.427567 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:02:50 crc kubenswrapper[4797]: 
I0930 18:02:50.427874 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.458786 4797 scope.go:117] "RemoveContainer" containerID="07aec71389677ee2f119d654e20546ab8f91f269226735869c5683e94e2e7a5a" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.520612 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c5546bcbd-84n4q"] Sep 30 18:02:50 crc kubenswrapper[4797]: E0930 18:02:50.521074 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-httpd" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521087 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-httpd" Sep 30 18:02:50 crc kubenswrapper[4797]: E0930 18:02:50.521111 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-api" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521117 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-api" Sep 30 18:02:50 crc kubenswrapper[4797]: E0930 18:02:50.521137 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a254b0d-9c02-4245-89e1-166a052becfb" containerName="init" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521142 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a254b0d-9c02-4245-89e1-166a052becfb" containerName="init" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521313 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-httpd" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521327 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a254b0d-9c02-4245-89e1-166a052becfb" containerName="init" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.521336 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2152a741-6045-4223-8d2a-9a1c24191d99" containerName="neutron-api" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.522419 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.527096 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.527292 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.535023 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c5546bcbd-84n4q"] Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.596601 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.596683 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.597389 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" 
containerStatusID={"Type":"cri-o","ID":"36a6adba377bc9ef5681d5d89a6434eaf7eb2d4e03e624ee8cb987e2724b3e14"} pod="openstack/horizon-6676d4ddcd-sxf6l" containerMessage="Container horizon failed startup probe, will be restarted" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.597426 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" containerID="cri-o://36a6adba377bc9ef5681d5d89a6434eaf7eb2d4e03e624ee8cb987e2724b3e14" gracePeriod=30 Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.603522 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b748fb867-znqws" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651242 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651404 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9365f350-1fad-4ab1-a694-49912e391383-logs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651422 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data-custom\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc 
kubenswrapper[4797]: I0930 18:02:50.651488 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-internal-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651536 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-combined-ca-bundle\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651602 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9px\" (UniqueName: \"kubernetes.io/projected/9365f350-1fad-4ab1-a694-49912e391383-kube-api-access-lq9px\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.651628 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-public-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754037 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-public-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " 
pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754318 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9365f350-1fad-4ab1-a694-49912e391383-logs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754407 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data-custom\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754426 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-internal-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754479 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-combined-ca-bundle\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 
18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.754518 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9px\" (UniqueName: \"kubernetes.io/projected/9365f350-1fad-4ab1-a694-49912e391383-kube-api-access-lq9px\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.761168 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data-custom\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.761502 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9365f350-1fad-4ab1-a694-49912e391383-logs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.763812 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-public-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.769166 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-internal-tls-certs\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.769581 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-config-data\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.775043 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9365f350-1fad-4ab1-a694-49912e391383-combined-ca-bundle\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.793098 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9px\" (UniqueName: \"kubernetes.io/projected/9365f350-1fad-4ab1-a694-49912e391383-kube-api-access-lq9px\") pod \"barbican-api-7c5546bcbd-84n4q\" (UID: \"9365f350-1fad-4ab1-a694-49912e391383\") " pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.917109 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:50 crc kubenswrapper[4797]: I0930 18:02:50.996678 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.015175 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.025039 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"] Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.026949 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5768c5b854-k959d" event={"ID":"9c9236f9-becb-4d5c-aeb5-56a3b0547c86","Type":"ContainerStarted","Data":"5d9f8c1ccbfca3217468763e9ed06a056f72e098746f1f41b52c95dd204bb2f9"} Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.030455 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b75d759d5-6bwm5" event={"ID":"080f211c-e410-4f16-af62-78ce0d6d9d26","Type":"ContainerStarted","Data":"c2ef5d61d5b25a7f27c12402ce1aa33d4aecb9cc3b4ac27fe5ae132e558353ea"} Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.059900 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerStarted","Data":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.064413 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.065082 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.065143 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/watcher-decision-engine-0" Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.153669 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:51 crc kubenswrapper[4797]: I0930 18:02:51.577828 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c5546bcbd-84n4q"] Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.111047 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5768c5b854-k959d" event={"ID":"9c9236f9-becb-4d5c-aeb5-56a3b0547c86","Type":"ContainerStarted","Data":"d1254565a2acd058b060c3900eee4df6e637a82e0808175adee826c130154e1b"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.119084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b75d759d5-6bwm5" event={"ID":"080f211c-e410-4f16-af62-78ce0d6d9d26","Type":"ContainerStarted","Data":"587337285d835668a1f7e3ab39064eee2edbe75c014f32a0d4ce6c324af9ee82"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.126597 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerID="09c896db5122ba4e8b7d0f86c00eeeb562dc328690f35b63b8d79a09ea36c3ba" exitCode=0 Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.126666 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" event={"ID":"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2","Type":"ContainerDied","Data":"09c896db5122ba4e8b7d0f86c00eeeb562dc328690f35b63b8d79a09ea36c3ba"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.126692 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" event={"ID":"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2","Type":"ContainerStarted","Data":"459f0e5663e3f327c89e348a0c4b44d969d477db0ceed66225dbfcb3d768f577"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.135348 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerStarted","Data":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.140053 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5768c5b854-k959d" podStartSLOduration=3.599875864 podStartE2EDuration="10.140032029s" podCreationTimestamp="2025-09-30 18:02:42 +0000 UTC" firstStartedPulling="2025-09-30 18:02:43.75358647 +0000 UTC m=+1214.276085698" lastFinishedPulling="2025-09-30 18:02:50.293742625 +0000 UTC m=+1220.816241863" observedRunningTime="2025-09-30 18:02:52.129925013 +0000 UTC m=+1222.652424251" watchObservedRunningTime="2025-09-30 18:02:52.140032029 +0000 UTC m=+1222.662531267" Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.148083 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerStarted","Data":"c19e5ac55b57603127d0e24084df8c88446825288e9bb67499586819066ae0d7"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.188191 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b75d759d5-6bwm5" podStartSLOduration=3.953835666 podStartE2EDuration="10.188169133s" podCreationTimestamp="2025-09-30 18:02:42 +0000 UTC" firstStartedPulling="2025-09-30 18:02:44.038295561 +0000 UTC m=+1214.560794799" lastFinishedPulling="2025-09-30 18:02:50.272629028 +0000 UTC m=+1220.795128266" observedRunningTime="2025-09-30 18:02:52.175414664 +0000 UTC m=+1222.697913902" watchObservedRunningTime="2025-09-30 18:02:52.188169133 +0000 UTC m=+1222.710668371" Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.197686 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c5546bcbd-84n4q" 
event={"ID":"9365f350-1fad-4ab1-a694-49912e391383","Type":"ContainerStarted","Data":"db4966005a17499849ba262bb10dc6210cabbb6c858bf2df8f744938aa7913ad"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.197727 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c5546bcbd-84n4q" event={"ID":"9365f350-1fad-4ab1-a694-49912e391383","Type":"ContainerStarted","Data":"7fd0bddc5fe15ae753c7976c7bbac0e2d9e28c6aecb3a1b6c58dbc539431604c"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.201381 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerStarted","Data":"7a2e2a77f0558b49e55711146661b5567feb98427a71076a34ea78b5cd6d1d20"} Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.201830 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.264576 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 18:02:52 crc kubenswrapper[4797]: I0930 18:02:52.265019 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.230409 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.234754 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.245668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.245697 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-f2pjb" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.245952 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.258779 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.355067 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerStarted","Data":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.368102 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerStarted","Data":"43de77b0e080dd002cfaaef827aed5aee427f288215446a6def7d8046a94ee3e"} Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.369821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c5546bcbd-84n4q" event={"ID":"9365f350-1fad-4ab1-a694-49912e391383","Type":"ContainerStarted","Data":"b9beadf4f5474d24060c23aacef3694763329da29a788f232cfab0b0cd63e1e0"} Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.370916 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.370937 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.372120 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerStarted","Data":"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704"} Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.374826 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" event={"ID":"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2","Type":"ContainerStarted","Data":"ee2aa224c3f2eacb283380e32219de391b77d94d5a09e4cf0deee6f7cc10fafa"} Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.406973 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c5546bcbd-84n4q" podStartSLOduration=3.40695644 podStartE2EDuration="3.40695644s" podCreationTimestamp="2025-09-30 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:53.403127726 +0000 UTC m=+1223.925626974" watchObservedRunningTime="2025-09-30 18:02:53.40695644 +0000 UTC m=+1223.929455678" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.417857 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config-secret\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.417910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " 
pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.418023 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8qp\" (UniqueName: \"kubernetes.io/projected/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-kube-api-access-9z8qp\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.418091 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.466526 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" podStartSLOduration=9.466504035 podStartE2EDuration="9.466504035s" podCreationTimestamp="2025-09-30 18:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:53.434128672 +0000 UTC m=+1223.956627910" watchObservedRunningTime="2025-09-30 18:02:53.466504035 +0000 UTC m=+1223.989003273" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.523506 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config-secret\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.523582 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.523847 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8qp\" (UniqueName: \"kubernetes.io/projected/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-kube-api-access-9z8qp\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.524010 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.525678 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.538143 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-openstack-config-secret\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.547969 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " 
pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.555768 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8qp\" (UniqueName: \"kubernetes.io/projected/2a0d8448-cb6f-4fe4-9458-fad3bfd11471-kube-api-access-9z8qp\") pod \"openstackclient\" (UID: \"2a0d8448-cb6f-4fe4-9458-fad3bfd11471\") " pod="openstack/openstackclient" Sep 30 18:02:53 crc kubenswrapper[4797]: I0930 18:02:53.568838 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.218230 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 18:02:54 crc kubenswrapper[4797]: W0930 18:02:54.222986 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0d8448_cb6f_4fe4_9458_fad3bfd11471.slice/crio-bfa47b24158c78404462fa8c5db1bbe056462b09026fe11453d1c097cc65b2f8 WatchSource:0}: Error finding container bfa47b24158c78404462fa8c5db1bbe056462b09026fe11453d1c097cc65b2f8: Status 404 returned error can't find the container with id bfa47b24158c78404462fa8c5db1bbe056462b09026fe11453d1c097cc65b2f8 Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.389415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerStarted","Data":"4966429bbad0f180ffaf733a928b53c548205ac11ab3f7bd41be13d02e969c6b"} Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.397484 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2a0d8448-cb6f-4fe4-9458-fad3bfd11471","Type":"ContainerStarted","Data":"bfa47b24158c78404462fa8c5db1bbe056462b09026fe11453d1c097cc65b2f8"} Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.405226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerStarted","Data":"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648"} Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.405455 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api-log" containerID="cri-o://43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" gracePeriod=30 Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.405490 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.405530 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api" containerID="cri-o://8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" gracePeriod=30 Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.406848 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.418269 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.437320028 podStartE2EDuration="10.418247733s" podCreationTimestamp="2025-09-30 18:02:44 +0000 UTC" firstStartedPulling="2025-09-30 18:02:51.035993044 +0000 UTC m=+1221.558492282" lastFinishedPulling="2025-09-30 18:02:52.016920749 +0000 UTC m=+1222.539419987" observedRunningTime="2025-09-30 18:02:54.412545788 +0000 UTC m=+1224.935045026" watchObservedRunningTime="2025-09-30 18:02:54.418247733 +0000 UTC m=+1224.940746971" Sep 30 18:02:54 crc kubenswrapper[4797]: I0930 18:02:54.438294 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=9.438266219 podStartE2EDuration="9.438266219s" podCreationTimestamp="2025-09-30 18:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:54.43387779 +0000 UTC m=+1224.956377028" watchObservedRunningTime="2025-09-30 18:02:54.438266219 +0000 UTC m=+1224.960765457" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.098641 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269071 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269126 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269175 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269236 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 
18:02:55.269270 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269323 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.269419 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m78kv\" (UniqueName: \"kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv\") pod \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\" (UID: \"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613\") " Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.271314 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.271844 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs" (OuterVolumeSpecName: "logs") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.280659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv" (OuterVolumeSpecName: "kube-api-access-m78kv") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "kube-api-access-m78kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.281533 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts" (OuterVolumeSpecName: "scripts") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.291598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.333754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.366972 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374091 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374115 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374124 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374133 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374141 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.374151 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m78kv\" (UniqueName: \"kubernetes.io/projected/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-kube-api-access-m78kv\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.392531 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data" (OuterVolumeSpecName: "config-data") pod "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" (UID: "6c3ec1e8-06d1-4d0b-b495-b27a07ccf613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417252 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerID="8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" exitCode=0 Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417287 4797 generic.go:334] "Generic (PLEG): container finished" podID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerID="43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" exitCode=143 Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417399 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417460 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerDied","Data":"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648"} Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417498 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerDied","Data":"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704"} Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417507 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c3ec1e8-06d1-4d0b-b495-b27a07ccf613","Type":"ContainerDied","Data":"7a2e2a77f0558b49e55711146661b5567feb98427a71076a34ea78b5cd6d1d20"} Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.417521 4797 scope.go:117] "RemoveContainer" 
containerID="8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.422903 4797 generic.go:334] "Generic (PLEG): container finished" podID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerID="36a6adba377bc9ef5681d5d89a6434eaf7eb2d4e03e624ee8cb987e2724b3e14" exitCode=0 Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.422967 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6676d4ddcd-sxf6l" event={"ID":"04e30fb7-7876-4a90-b887-05b7da2f7746","Type":"ContainerDied","Data":"36a6adba377bc9ef5681d5d89a6434eaf7eb2d4e03e624ee8cb987e2724b3e14"} Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.434454 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerStarted","Data":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.445407 4797 scope.go:117] "RemoveContainer" containerID="43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.472326 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.980590346 podStartE2EDuration="12.472307954s" podCreationTimestamp="2025-09-30 18:02:43 +0000 UTC" firstStartedPulling="2025-09-30 18:02:44.604360322 +0000 UTC m=+1215.126859560" lastFinishedPulling="2025-09-30 18:02:54.09607793 +0000 UTC m=+1224.618577168" observedRunningTime="2025-09-30 18:02:55.466903126 +0000 UTC m=+1225.989402364" watchObservedRunningTime="2025-09-30 18:02:55.472307954 +0000 UTC m=+1225.994807192" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.473922 4797 scope.go:117] "RemoveContainer" containerID="8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.476045 4797 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:02:55 crc kubenswrapper[4797]: E0930 18:02:55.476961 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648\": container with ID starting with 8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648 not found: ID does not exist" containerID="8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.476995 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648"} err="failed to get container status \"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648\": rpc error: code = NotFound desc = could not find container \"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648\": container with ID starting with 8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648 not found: ID does not exist" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.477019 4797 scope.go:117] "RemoveContainer" containerID="43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" Sep 30 18:02:55 crc kubenswrapper[4797]: E0930 18:02:55.477881 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704\": container with ID starting with 43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704 not found: ID does not exist" containerID="43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.477908 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704"} err="failed to get container status \"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704\": rpc error: code = NotFound desc = could not find container \"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704\": container with ID starting with 43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704 not found: ID does not exist" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.477921 4797 scope.go:117] "RemoveContainer" containerID="8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.478688 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648"} err="failed to get container status \"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648\": rpc error: code = NotFound desc = could not find container \"8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648\": container with ID starting with 8cc081a2370eb86c20ef44adda39996d5745d06ba73c1819ef458099b915f648 not found: ID does not exist" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.478707 4797 scope.go:117] "RemoveContainer" containerID="43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.480798 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704"} err="failed to get container status \"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704\": rpc error: code = NotFound desc = could not find container \"43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704\": container with ID starting with 
43ea2a1cdeb7f64d7273ae3f4a5dc01bd4e90d66134862db11dccea50e122704 not found: ID does not exist" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.505471 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.519311 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.529964 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:55 crc kubenswrapper[4797]: E0930 18:02:55.530533 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api-log" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.530553 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api-log" Sep 30 18:02:55 crc kubenswrapper[4797]: E0930 18:02:55.530568 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.530575 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.530836 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api-log" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.530873 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" containerName="cinder-api" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.532063 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.537785 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.537934 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.537942 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.541254 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.679294 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.679532 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.679606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.679683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-scripts\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.679922 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.680043 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.680097 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-logs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.680135 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fstgf\" (UniqueName: \"kubernetes.io/projected/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-kube-api-access-fstgf\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.680217 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data\") pod \"cinder-api-0\" (UID: 
\"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.760252 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782692 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-logs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782723 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fstgf\" (UniqueName: \"kubernetes.io/projected/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-kube-api-access-fstgf\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782800 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") 
" pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782831 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.782845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.783037 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.783088 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.783140 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-scripts\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.783939 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-logs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.788895 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.792050 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.792742 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.792960 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-scripts\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.793861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.805107 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.808211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fstgf\" (UniqueName: \"kubernetes.io/projected/b9f1efb9-4e3d-4371-bd43-55cffbe2d06d-kube-api-access-fstgf\") pod \"cinder-api-0\" (UID: \"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d\") " pod="openstack/cinder-api-0" Sep 30 18:02:55 crc kubenswrapper[4797]: I0930 18:02:55.875988 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.216917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.264061 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3ec1e8-06d1-4d0b-b495-b27a07ccf613" path="/var/lib/kubelet/pods/6c3ec1e8-06d1-4d0b-b495-b27a07ccf613/volumes" Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.426817 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 18:02:56 crc kubenswrapper[4797]: W0930 18:02:56.436582 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9f1efb9_4e3d_4371_bd43_55cffbe2d06d.slice/crio-0f565696305a5258445909ae1104b9e24373e228d8ae7d1851fd6b84e9132b40 WatchSource:0}: Error finding container 0f565696305a5258445909ae1104b9e24373e228d8ae7d1851fd6b84e9132b40: Status 404 returned error can't find the container with id 0f565696305a5258445909ae1104b9e24373e228d8ae7d1851fd6b84e9132b40 Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.476557 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-6676d4ddcd-sxf6l" event={"ID":"04e30fb7-7876-4a90-b887-05b7da2f7746","Type":"ContainerStarted","Data":"78dc829cb21710458b2be6d9d3d0ee7dc173a839467476356be083937766ecfe"} Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.482215 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d","Type":"ContainerStarted","Data":"0f565696305a5258445909ae1104b9e24373e228d8ae7d1851fd6b84e9132b40"} Sep 30 18:02:56 crc kubenswrapper[4797]: I0930 18:02:56.482254 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:02:57 crc kubenswrapper[4797]: I0930 18:02:57.503527 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d","Type":"ContainerStarted","Data":"4e67b5fb1331c6a32f5bef543883a2e05a4ac81f397f320c112c3c552bf3162d"} Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.520797 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9f1efb9-4e3d-4371-bd43-55cffbe2d06d","Type":"ContainerStarted","Data":"dc1f3c61628cb3bf38cff1516a2d9f5b121b54cece2cff9c4fe5852ecbcf1713"} Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.521406 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.546969 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.546949497 podStartE2EDuration="3.546949497s" podCreationTimestamp="2025-09-30 18:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:02:58.539893584 +0000 UTC m=+1229.062392822" watchObservedRunningTime="2025-09-30 18:02:58.546949497 +0000 UTC m=+1229.069448735" 
Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.806051 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-594ff6944c-p2jp5"] Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.807998 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.810626 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.811251 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.811542 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.836064 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-594ff6944c-p2jp5"] Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.861991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-run-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862031 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-internal-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862053 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-public-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-config-data\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-log-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzbk\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-kube-api-access-hwzbk\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862217 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-combined-ca-bundle\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.862241 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-etc-swift\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.963357 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-config-data\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.963707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-log-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.963868 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzbk\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-kube-api-access-hwzbk\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.963971 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-combined-ca-bundle\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.964092 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-etc-swift\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.964207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-run-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.964286 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-internal-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.964360 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-public-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.965028 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-run-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.968849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-log-httpd\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.972351 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-public-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.972751 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-combined-ca-bundle\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.973611 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-config-data\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.982472 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-etc-swift\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.984471 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzbk\" (UniqueName: \"kubernetes.io/projected/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-kube-api-access-hwzbk\") pod 
\"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:58 crc kubenswrapper[4797]: I0930 18:02:58.987190 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ca411e-ead4-4a2d-9eba-f3f8fffcad46-internal-tls-certs\") pod \"swift-proxy-594ff6944c-p2jp5\" (UID: \"89ca411e-ead4-4a2d-9eba-f3f8fffcad46\") " pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.137761 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.673028 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.673861 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="proxy-httpd" containerID="cri-o://1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" gracePeriod=30 Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.674085 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="sg-core" containerID="cri-o://cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" gracePeriod=30 Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.674203 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-notification-agent" containerID="cri-o://66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" gracePeriod=30 Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.674252 4797 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-central-agent" containerID="cri-o://02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" gracePeriod=30 Sep 30 18:02:59 crc kubenswrapper[4797]: I0930 18:02:59.737964 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-594ff6944c-p2jp5"] Sep 30 18:03:00 crc kubenswrapper[4797]: E0930 18:03:00.317612 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebdbd5b8_ef3c_4ba9_927a_dfc963e8f11e.slice/crio-66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebdbd5b8_ef3c_4ba9_927a_dfc963e8f11e.slice/crio-conmon-66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff.scope\": RecentStats: unable to find data in memory cache]" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.405579 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.489420 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.489872 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="dnsmasq-dns" containerID="cri-o://4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891" gracePeriod=10 Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.530267 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570337 4797 generic.go:334] "Generic (PLEG): container finished" podID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" exitCode=0 Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570365 4797 generic.go:334] "Generic (PLEG): container finished" podID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" exitCode=2 Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570374 4797 generic.go:334] "Generic (PLEG): container finished" podID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" exitCode=0 Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570381 4797 generic.go:334] "Generic (PLEG): container finished" podID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" exitCode=0 Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570417 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerDied","Data":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570456 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerDied","Data":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerDied","Data":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} Sep 30 18:03:00 crc 
kubenswrapper[4797]: I0930 18:03:00.570477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerDied","Data":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570485 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e","Type":"ContainerDied","Data":"4ed8c737d78a9cd88b2e314375229ac1529b412e1cc95b7de6203300c0c33b38"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570500 4797 scope.go:117] "RemoveContainer" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.570634 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.582851 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-594ff6944c-p2jp5" event={"ID":"89ca411e-ead4-4a2d-9eba-f3f8fffcad46","Type":"ContainerStarted","Data":"52d9ca53411dfc89fb8e935cf46dc5eb8d50cbda6bd87a59613e556e10113e67"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.582880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-594ff6944c-p2jp5" event={"ID":"89ca411e-ead4-4a2d-9eba-f3f8fffcad46","Type":"ContainerStarted","Data":"83ef11b2731d7a1ed07ff3e53c659a655755c94c4ed1cad625e3d0dc039927a3"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.582890 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-594ff6944c-p2jp5" event={"ID":"89ca411e-ead4-4a2d-9eba-f3f8fffcad46","Type":"ContainerStarted","Data":"ae46fbca029af72908ee6a9b748f59c0e7760bedeb115c4392ae3a588ce6b123"} Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.584048 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.584069 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.610174 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-594ff6944c-p2jp5" podStartSLOduration=2.610158622 podStartE2EDuration="2.610158622s" podCreationTimestamp="2025-09-30 18:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:00.609455693 +0000 UTC m=+1231.131954951" watchObservedRunningTime="2025-09-30 18:03:00.610158622 +0000 UTC m=+1231.132657860" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.650650 4797 scope.go:117] "RemoveContainer" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.678588 4797 scope.go:117] "RemoveContainer" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704158 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mbhj\" (UniqueName: \"kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704249 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704410 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704426 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704458 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704471 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.704492 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd\") pod \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\" (UID: \"ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e\") " Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.705067 4797 scope.go:117] "RemoveContainer" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.705331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.705667 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.709593 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts" (OuterVolumeSpecName: "scripts") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.710108 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj" (OuterVolumeSpecName: "kube-api-access-7mbhj") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "kube-api-access-7mbhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.750637 4797 scope.go:117] "RemoveContainer" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.768902 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 18:03:00 crc kubenswrapper[4797]: E0930 18:03:00.780357 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": container with ID starting with 1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398 not found: ID does not exist" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.780401 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} err="failed to get container status \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": rpc error: code = NotFound desc = could not find container \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": container with ID starting with 1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398 not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.780442 4797 scope.go:117] "RemoveContainer" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: E0930 18:03:00.781339 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": container with ID starting with cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a 
not found: ID does not exist" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.781389 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} err="failed to get container status \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": rpc error: code = NotFound desc = could not find container \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": container with ID starting with cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.781446 4797 scope.go:117] "RemoveContainer" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: E0930 18:03:00.781911 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": container with ID starting with 66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff not found: ID does not exist" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.781940 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} err="failed to get container status \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": rpc error: code = NotFound desc = could not find container \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": container with ID starting with 66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 
18:03:00.781956 4797 scope.go:117] "RemoveContainer" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: E0930 18:03:00.782178 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": container with ID starting with 02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf not found: ID does not exist" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.782211 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} err="failed to get container status \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": rpc error: code = NotFound desc = could not find container \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": container with ID starting with 02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.782224 4797 scope.go:117] "RemoveContainer" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.782508 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} err="failed to get container status \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": rpc error: code = NotFound desc = could not find container \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": container with ID starting with 1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398 not found: ID does not exist" Sep 30 18:03:00 crc 
kubenswrapper[4797]: I0930 18:03:00.782527 4797 scope.go:117] "RemoveContainer" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.785131 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} err="failed to get container status \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": rpc error: code = NotFound desc = could not find container \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": container with ID starting with cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.785170 4797 scope.go:117] "RemoveContainer" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.786069 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} err="failed to get container status \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": rpc error: code = NotFound desc = could not find container \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": container with ID starting with 66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.786086 4797 scope.go:117] "RemoveContainer" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.786522 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} err="failed to get container status 
\"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": rpc error: code = NotFound desc = could not find container \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": container with ID starting with 02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.786545 4797 scope.go:117] "RemoveContainer" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.788937 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} err="failed to get container status \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": rpc error: code = NotFound desc = could not find container \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": container with ID starting with 1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398 not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.788989 4797 scope.go:117] "RemoveContainer" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.796661 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} err="failed to get container status \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": rpc error: code = NotFound desc = could not find container \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": container with ID starting with cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.796702 4797 scope.go:117] "RemoveContainer" 
containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.797596 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} err="failed to get container status \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": rpc error: code = NotFound desc = could not find container \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": container with ID starting with 66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.797616 4797 scope.go:117] "RemoveContainer" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.797746 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.798061 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} err="failed to get container status \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": rpc error: code = NotFound desc = could not find container \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": container with ID starting with 02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.798087 4797 scope.go:117] "RemoveContainer" containerID="1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.798610 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398"} err="failed to get container status \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": rpc error: code = NotFound desc = could not find container \"1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398\": container with ID starting with 1c035305a89b136b2ade974e4dc86748629c120c71c72e039dde6903a5ea8398 not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.798632 4797 scope.go:117] "RemoveContainer" containerID="cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.799881 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a"} err="failed to get container status \"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": rpc error: code = NotFound desc = could not find container 
\"cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a\": container with ID starting with cd1bf4868c9a3959006e290519483eb7f24344bfa4cfccc2e1debc5a10f0cc8a not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.799913 4797 scope.go:117] "RemoveContainer" containerID="66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.801302 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff"} err="failed to get container status \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": rpc error: code = NotFound desc = could not find container \"66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff\": container with ID starting with 66d80ce0dc8075e7efc81001278a25ace089b956792b499c2cfcf8d26a544fff not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.801329 4797 scope.go:117] "RemoveContainer" containerID="02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817171 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817191 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817199 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817213 4797 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mbhj\" (UniqueName: \"kubernetes.io/projected/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-kube-api-access-7mbhj\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817222 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.817316 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf"} err="failed to get container status \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": rpc error: code = NotFound desc = could not find container \"02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf\": container with ID starting with 02e49e26a4354a5f686c62ff4f36d3464c57f8b6555a07a1abed7416dfc7f7bf not found: ID does not exist" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.874074 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.909643 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.925226 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:00 crc kubenswrapper[4797]: I0930 18:03:00.959244 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data" (OuterVolumeSpecName: "config-data") pod "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" (UID: "ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.032029 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.185684 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.202505 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.209420 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226277 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226703 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="init" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226717 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="init" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226733 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="sg-core" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226740 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="sg-core" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226752 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-central-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226759 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-central-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226786 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-notification-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226793 4797 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-notification-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226800 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="proxy-httpd" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226805 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="proxy-httpd" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.226821 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="dnsmasq-dns" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226828 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="dnsmasq-dns" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.226996 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-notification-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.227010 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="proxy-httpd" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.227023 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerName="dnsmasq-dns" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.227043 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="ceilometer-central-agent" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.227054 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" containerName="sg-core" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.229149 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.235273 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.235333 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.254337 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.338966 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkbv4\" (UniqueName: \"kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.339214 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.339345 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.339458 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc 
kubenswrapper[4797]: I0930 18:03:01.339681 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.339754 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0\") pod \"a263de13-afff-4c1b-8b43-8ecfac6c9855\" (UID: \"a263de13-afff-4c1b-8b43-8ecfac6c9855\") " Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340019 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340094 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340266 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340341 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6xq\" (UniqueName: 
\"kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340409 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.340650 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.361600 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4" (OuterVolumeSpecName: "kube-api-access-hkbv4") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "kube-api-access-hkbv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.442830 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.442902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.442924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.442990 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.443006 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6xq\" (UniqueName: \"kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.443024 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.443073 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.443115 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkbv4\" (UniqueName: \"kubernetes.io/projected/a263de13-afff-4c1b-8b43-8ecfac6c9855-kube-api-access-hkbv4\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.453640 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.454759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.455058 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.469519 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.469827 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.472300 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.484266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.491396 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.494078 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6xq\" (UniqueName: \"kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq\") pod \"ceilometer-0\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.500482 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.523932 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.538043 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config" (OuterVolumeSpecName: "config") pod "a263de13-afff-4c1b-8b43-8ecfac6c9855" (UID: "a263de13-afff-4c1b-8b43-8ecfac6c9855"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.547634 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.547665 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.547677 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.547686 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.547696 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a263de13-afff-4c1b-8b43-8ecfac6c9855-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.553169 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.603754 4797 generic.go:334] "Generic (PLEG): container finished" podID="a263de13-afff-4c1b-8b43-8ecfac6c9855" containerID="4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891" exitCode=0 Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.603881 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" event={"ID":"a263de13-afff-4c1b-8b43-8ecfac6c9855","Type":"ContainerDied","Data":"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891"} Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.603912 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.603947 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-mtmnm" event={"ID":"a263de13-afff-4c1b-8b43-8ecfac6c9855","Type":"ContainerDied","Data":"77569fc9a44ec6de606022d1154d685ddad58210d74c8d3a752fab8bfb7b95bf"} Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.603969 4797 scope.go:117] "RemoveContainer" containerID="4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.617561 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="cinder-scheduler" containerID="cri-o://43de77b0e080dd002cfaaef827aed5aee427f288215446a6def7d8046a94ee3e" gracePeriod=30 Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.618097 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="probe" containerID="cri-o://4966429bbad0f180ffaf733a928b53c548205ac11ab3f7bd41be13d02e969c6b" gracePeriod=30 
Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.650486 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.658227 4797 scope.go:117] "RemoveContainer" containerID="653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.659952 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-mtmnm"] Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.723356 4797 scope.go:117] "RemoveContainer" containerID="4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.729190 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891\": container with ID starting with 4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891 not found: ID does not exist" containerID="4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.729224 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891"} err="failed to get container status \"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891\": rpc error: code = NotFound desc = could not find container \"4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891\": container with ID starting with 4d61d7fef4abc31f37ebf2fd704d87268b749de94b4c7ff625c8cf2e7d325891 not found: ID does not exist" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.729252 4797 scope.go:117] "RemoveContainer" containerID="653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f" Sep 30 18:03:01 crc kubenswrapper[4797]: E0930 18:03:01.730686 4797 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f\": container with ID starting with 653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f not found: ID does not exist" containerID="653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f" Sep 30 18:03:01 crc kubenswrapper[4797]: I0930 18:03:01.730706 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f"} err="failed to get container status \"653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f\": rpc error: code = NotFound desc = could not find container \"653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f\": container with ID starting with 653e3ed0006039de8d513ada803a5c46d36eae4a2bb39bc16e697478ddbf634f not found: ID does not exist" Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.103874 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.253555 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a263de13-afff-4c1b-8b43-8ecfac6c9855" path="/var/lib/kubelet/pods/a263de13-afff-4c1b-8b43-8ecfac6c9855/volumes" Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.254200 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e" path="/var/lib/kubelet/pods/ebdbd5b8-ef3c-4ba9-927a-dfc963e8f11e/volumes" Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.636244 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerStarted","Data":"173fab448e629adbb14e7e0d82eeda316754443424bbe69bc57d287561fdb2c5"} Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.639212 
4797 generic.go:334] "Generic (PLEG): container finished" podID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerID="4966429bbad0f180ffaf733a928b53c548205ac11ab3f7bd41be13d02e969c6b" exitCode=0 Sep 30 18:03:02 crc kubenswrapper[4797]: I0930 18:03:02.639297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerDied","Data":"4966429bbad0f180ffaf733a928b53c548205ac11ab3f7bd41be13d02e969c6b"} Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.309785 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.539117 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c5546bcbd-84n4q" Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.596503 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"] Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.596978 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" containerID="cri-o://8415d534e865748dd514b8e93227d1586e5e9892fb6d2447d1aae16090822896" gracePeriod=30 Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.597384 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api" containerID="cri-o://c6cf3c12ebb6936a344d9193d65cfe2fdeef6a522bdcdca83805e60e49e579ea" gracePeriod=30 Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.674644 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerStarted","Data":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.678694 4797 generic.go:334] "Generic (PLEG): container finished" podID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerID="43de77b0e080dd002cfaaef827aed5aee427f288215446a6def7d8046a94ee3e" exitCode=0 Sep 30 18:03:03 crc kubenswrapper[4797]: I0930 18:03:03.680256 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerDied","Data":"43de77b0e080dd002cfaaef827aed5aee427f288215446a6def7d8046a94ee3e"} Sep 30 18:03:04 crc kubenswrapper[4797]: I0930 18:03:04.698518 4797 generic.go:334] "Generic (PLEG): container finished" podID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerID="8415d534e865748dd514b8e93227d1586e5e9892fb6d2447d1aae16090822896" exitCode=143 Sep 30 18:03:04 crc kubenswrapper[4797]: I0930 18:03:04.698599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerDied","Data":"8415d534e865748dd514b8e93227d1586e5e9892fb6d2447d1aae16090822896"} Sep 30 18:03:05 crc kubenswrapper[4797]: I0930 18:03:05.582557 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:03:05 crc kubenswrapper[4797]: I0930 18:03:05.582613 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:03:05 crc kubenswrapper[4797]: I0930 18:03:05.583655 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: 
connect: connection refused" Sep 30 18:03:06 crc kubenswrapper[4797]: I0930 18:03:06.842322 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": read tcp 10.217.0.2:48550->10.217.0.181:9311: read: connection reset by peer" Sep 30 18:03:06 crc kubenswrapper[4797]: I0930 18:03:06.842328 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": read tcp 10.217.0.2:48548->10.217.0.181:9311: read: connection reset by peer" Sep 30 18:03:07 crc kubenswrapper[4797]: I0930 18:03:07.740543 4797 generic.go:334] "Generic (PLEG): container finished" podID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerID="c6cf3c12ebb6936a344d9193d65cfe2fdeef6a522bdcdca83805e60e49e579ea" exitCode=0 Sep 30 18:03:07 crc kubenswrapper[4797]: I0930 18:03:07.740597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerDied","Data":"c6cf3c12ebb6936a344d9193d65cfe2fdeef6a522bdcdca83805e60e49e579ea"} Sep 30 18:03:08 crc kubenswrapper[4797]: I0930 18:03:08.696522 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 18:03:08 crc kubenswrapper[4797]: I0930 18:03:08.777463 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": dial tcp 10.217.0.181:9311: connect: connection refused" Sep 30 18:03:08 crc kubenswrapper[4797]: I0930 18:03:08.777758 4797 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbffffb8-2h9zs" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": dial tcp 10.217.0.181:9311: connect: connection refused" Sep 30 18:03:08 crc kubenswrapper[4797]: I0930 18:03:08.910663 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:09 crc kubenswrapper[4797]: I0930 18:03:09.145409 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:03:09 crc kubenswrapper[4797]: I0930 18:03:09.150615 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-594ff6944c-p2jp5" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.041318 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5hxvf"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.047152 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.074594 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5hxvf"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.089466 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-sqrmz"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.092330 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.108852 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sqrmz"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.164880 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsdm\" (UniqueName: \"kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm\") pod \"nova-api-db-create-5hxvf\" (UID: \"e516f41b-fb09-421f-9cb2-a10e1a24f02c\") " pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.266095 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsdm\" (UniqueName: \"kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm\") pod \"nova-api-db-create-5hxvf\" (UID: \"e516f41b-fb09-421f-9cb2-a10e1a24f02c\") " pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.266394 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgt9\" (UniqueName: \"kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9\") pod \"nova-cell0-db-create-sqrmz\" (UID: \"e0b9e58f-0903-462b-8e48-b8fbaca162d9\") " pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.276663 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mnf4c"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.285777 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mnf4c"] Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.285928 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.291837 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsdm\" (UniqueName: \"kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm\") pod \"nova-api-db-create-5hxvf\" (UID: \"e516f41b-fb09-421f-9cb2-a10e1a24f02c\") " pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.368619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgt9\" (UniqueName: \"kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9\") pod \"nova-cell0-db-create-sqrmz\" (UID: \"e0b9e58f-0903-462b-8e48-b8fbaca162d9\") " pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.368696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfdm\" (UniqueName: \"kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm\") pod \"nova-cell1-db-create-mnf4c\" (UID: \"b92a15c8-7839-4cad-ab41-953e0d5c44f1\") " pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.387296 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgt9\" (UniqueName: \"kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9\") pod \"nova-cell0-db-create-sqrmz\" (UID: \"e0b9e58f-0903-462b-8e48-b8fbaca162d9\") " pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.393746 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.413153 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.434198 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.469602 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs\") pod \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.470254 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85zhg\" (UniqueName: \"kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg\") pod \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.470261 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs" (OuterVolumeSpecName: "logs") pod "7a0b8e42-729d-430b-a9bf-99d3e949e06d" (UID: "7a0b8e42-729d-430b-a9bf-99d3e949e06d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.470661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle\") pod \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.470824 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom\") pod \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.470946 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data\") pod \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\" (UID: \"7a0b8e42-729d-430b-a9bf-99d3e949e06d\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.471647 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfdm\" (UniqueName: \"kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm\") pod \"nova-cell1-db-create-mnf4c\" (UID: \"b92a15c8-7839-4cad-ab41-953e0d5c44f1\") " pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.472345 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0b8e42-729d-430b-a9bf-99d3e949e06d-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.477809 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg" 
(OuterVolumeSpecName: "kube-api-access-85zhg") pod "7a0b8e42-729d-430b-a9bf-99d3e949e06d" (UID: "7a0b8e42-729d-430b-a9bf-99d3e949e06d"). InnerVolumeSpecName "kube-api-access-85zhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.479660 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a0b8e42-729d-430b-a9bf-99d3e949e06d" (UID: "7a0b8e42-729d-430b-a9bf-99d3e949e06d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.490392 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.491209 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfdm\" (UniqueName: \"kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm\") pod \"nova-cell1-db-create-mnf4c\" (UID: \"b92a15c8-7839-4cad-ab41-953e0d5c44f1\") " pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.576174 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d89f\" (UniqueName: \"kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.576563 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 
crc kubenswrapper[4797]: I0930 18:03:11.576729 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.576914 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.577056 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.577216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle\") pod \"19d21904-7699-4ee9-95e5-17c1621b2f6b\" (UID: \"19d21904-7699-4ee9-95e5-17c1621b2f6b\") " Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.577912 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.578537 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85zhg\" (UniqueName: \"kubernetes.io/projected/7a0b8e42-729d-430b-a9bf-99d3e949e06d-kube-api-access-85zhg\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.578582 
4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: "19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.586589 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0b8e42-729d-430b-a9bf-99d3e949e06d" (UID: "7a0b8e42-729d-430b-a9bf-99d3e949e06d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.586636 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f" (OuterVolumeSpecName: "kube-api-access-7d89f") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: "19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "kube-api-access-7d89f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.586787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts" (OuterVolumeSpecName: "scripts") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: "19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.586888 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data" (OuterVolumeSpecName: "config-data") pod "7a0b8e42-729d-430b-a9bf-99d3e949e06d" (UID: "7a0b8e42-729d-430b-a9bf-99d3e949e06d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.593595 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: "19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.663015 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692682 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692718 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692727 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0b8e42-729d-430b-a9bf-99d3e949e06d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692738 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d89f\" (UniqueName: \"kubernetes.io/projected/19d21904-7699-4ee9-95e5-17c1621b2f6b-kube-api-access-7d89f\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692748 4797 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.692756 4797 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19d21904-7699-4ee9-95e5-17c1621b2f6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.709704 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: 
"19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.794530 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.836186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2a0d8448-cb6f-4fe4-9458-fad3bfd11471","Type":"ContainerStarted","Data":"986c0ad2e242005b7c36988ac773eee5385111421d5e798c0ffc4955aaa0a1f2"} Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.862641 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbffffb8-2h9zs" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.864526 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbffffb8-2h9zs" event={"ID":"7a0b8e42-729d-430b-a9bf-99d3e949e06d","Type":"ContainerDied","Data":"299b24373bf5385e853cfb7f21d1d8a2381f6e49987375dde74a5986e0a02768"} Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.864655 4797 scope.go:117] "RemoveContainer" containerID="c6cf3c12ebb6936a344d9193d65cfe2fdeef6a522bdcdca83805e60e49e579ea" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.879986 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.205224854 podStartE2EDuration="18.879962093s" podCreationTimestamp="2025-09-30 18:02:53 +0000 UTC" firstStartedPulling="2025-09-30 18:02:54.232865463 +0000 UTC m=+1224.755364701" lastFinishedPulling="2025-09-30 18:03:10.907602702 +0000 UTC m=+1241.430101940" observedRunningTime="2025-09-30 18:03:11.86188876 +0000 UTC m=+1242.384387998" watchObservedRunningTime="2025-09-30 18:03:11.879962093 +0000 UTC 
m=+1242.402461331" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.901145 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerStarted","Data":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.910896 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data" (OuterVolumeSpecName: "config-data") pod "19d21904-7699-4ee9-95e5-17c1621b2f6b" (UID: "19d21904-7699-4ee9-95e5-17c1621b2f6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.933777 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19d21904-7699-4ee9-95e5-17c1621b2f6b","Type":"ContainerDied","Data":"c19e5ac55b57603127d0e24084df8c88446825288e9bb67499586819066ae0d7"} Sep 30 18:03:11 crc kubenswrapper[4797]: I0930 18:03:11.933859 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:11.999837 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d21904-7699-4ee9-95e5-17c1621b2f6b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.015618 4797 scope.go:117] "RemoveContainer" containerID="8415d534e865748dd514b8e93227d1586e5e9892fb6d2447d1aae16090822896" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.037509 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.046906 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.055492 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.063737 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5bbffffb8-2h9zs"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.073100 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5hxvf"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.082587 4797 scope.go:117] "RemoveContainer" containerID="4966429bbad0f180ffaf733a928b53c548205ac11ab3f7bd41be13d02e969c6b" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.082664 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:12 crc kubenswrapper[4797]: E0930 18:03:12.083121 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083136 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" 
containerName="barbican-api" Sep 30 18:03:12 crc kubenswrapper[4797]: E0930 18:03:12.083154 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="cinder-scheduler" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083163 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="cinder-scheduler" Sep 30 18:03:12 crc kubenswrapper[4797]: E0930 18:03:12.083182 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083218 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" Sep 30 18:03:12 crc kubenswrapper[4797]: E0930 18:03:12.083242 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="probe" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083251 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="probe" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083456 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083477 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" containerName="barbican-api-log" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083492 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" containerName="cinder-scheduler" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.083508 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" 
containerName="probe" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.084581 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.089770 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.091513 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.185080 4797 scope.go:117] "RemoveContainer" containerID="43de77b0e080dd002cfaaef827aed5aee427f288215446a6def7d8046a94ee3e" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216165 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216244 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-scripts\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216275 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216302 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6910128f-6ddf-4edf-86b6-a313f85db70d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.216322 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbk6\" (UniqueName: \"kubernetes.io/projected/6910128f-6ddf-4edf-86b6-a313f85db70d-kube-api-access-4gbk6\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.265635 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d21904-7699-4ee9-95e5-17c1621b2f6b" path="/var/lib/kubelet/pods/19d21904-7699-4ee9-95e5-17c1621b2f6b/volumes" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.267275 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0b8e42-729d-430b-a9bf-99d3e949e06d" path="/var/lib/kubelet/pods/7a0b8e42-729d-430b-a9bf-99d3e949e06d/volumes" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.268021 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sqrmz"] Sep 30 18:03:12 crc kubenswrapper[4797]: W0930 18:03:12.277557 4797 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b9e58f_0903_462b_8e48_b8fbaca162d9.slice/crio-49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812 WatchSource:0}: Error finding container 49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812: Status 404 returned error can't find the container with id 49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812 Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318623 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318673 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318735 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-scripts\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318772 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318799 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6910128f-6ddf-4edf-86b6-a313f85db70d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.318821 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbk6\" (UniqueName: \"kubernetes.io/projected/6910128f-6ddf-4edf-86b6-a313f85db70d-kube-api-access-4gbk6\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.320750 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6910128f-6ddf-4edf-86b6-a313f85db70d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.328133 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.328135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.329892 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.333716 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6910128f-6ddf-4edf-86b6-a313f85db70d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.335410 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbk6\" (UniqueName: \"kubernetes.io/projected/6910128f-6ddf-4edf-86b6-a313f85db70d-kube-api-access-4gbk6\") pod \"cinder-scheduler-0\" (UID: \"6910128f-6ddf-4edf-86b6-a313f85db70d\") " pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.385193 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mnf4c"] Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.461642 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.947573 4797 generic.go:334] "Generic (PLEG): container finished" podID="b92a15c8-7839-4cad-ab41-953e0d5c44f1" containerID="83892b750d8911c3cd7169bbb2c7bb7a3a8e0b8cf09aecc950287771af0656e3" exitCode=0 Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.947664 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mnf4c" event={"ID":"b92a15c8-7839-4cad-ab41-953e0d5c44f1","Type":"ContainerDied","Data":"83892b750d8911c3cd7169bbb2c7bb7a3a8e0b8cf09aecc950287771af0656e3"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.947842 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mnf4c" event={"ID":"b92a15c8-7839-4cad-ab41-953e0d5c44f1","Type":"ContainerStarted","Data":"6705ea344024592056aafda2fe852449d704f9fe75492b28856a50e571724f78"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.949022 4797 generic.go:334] "Generic (PLEG): container finished" podID="e516f41b-fb09-421f-9cb2-a10e1a24f02c" containerID="fcd6b1d83814a5dca340e0044d874c9c9722f62af998382cdaf15f3faeff0396" exitCode=0 Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.949091 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5hxvf" event={"ID":"e516f41b-fb09-421f-9cb2-a10e1a24f02c","Type":"ContainerDied","Data":"fcd6b1d83814a5dca340e0044d874c9c9722f62af998382cdaf15f3faeff0396"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.949111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5hxvf" event={"ID":"e516f41b-fb09-421f-9cb2-a10e1a24f02c","Type":"ContainerStarted","Data":"21024b000d5f088818d612724f1183630dfdde9062e0fb9f8dd590e7d6bfb66a"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.951069 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerStarted","Data":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.952824 4797 generic.go:334] "Generic (PLEG): container finished" podID="e0b9e58f-0903-462b-8e48-b8fbaca162d9" containerID="8dcf56f382679b43f160733ed83792467ab240c2bdc493a6fb819a33fa2d16df" exitCode=0 Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.952872 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sqrmz" event={"ID":"e0b9e58f-0903-462b-8e48-b8fbaca162d9","Type":"ContainerDied","Data":"8dcf56f382679b43f160733ed83792467ab240c2bdc493a6fb819a33fa2d16df"} Sep 30 18:03:12 crc kubenswrapper[4797]: I0930 18:03:12.952893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sqrmz" event={"ID":"e0b9e58f-0903-462b-8e48-b8fbaca162d9","Type":"ContainerStarted","Data":"49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812"} Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.071912 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.984706 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerStarted","Data":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.985112 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-central-agent" containerID="cri-o://79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" gracePeriod=30 Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.985148 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.985356 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="proxy-httpd" containerID="cri-o://37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" gracePeriod=30 Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.985389 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="sg-core" containerID="cri-o://9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" gracePeriod=30 Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.985318 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-notification-agent" containerID="cri-o://bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" gracePeriod=30 Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.994230 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6910128f-6ddf-4edf-86b6-a313f85db70d","Type":"ContainerStarted","Data":"d41f5dab5a359b7729610bc3596f14f0d53c68dd8c4f20656194670903fd905f"} Sep 30 18:03:13 crc kubenswrapper[4797]: I0930 18:03:13.994276 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6910128f-6ddf-4edf-86b6-a313f85db70d","Type":"ContainerStarted","Data":"76e422fa57e7cf0bff1aae46efc2ec4d54b710f10ebb4ec9629cacf5a89e387e"} Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.017551 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.579068843 podStartE2EDuration="13.017533248s" podCreationTimestamp="2025-09-30 18:03:01 +0000 UTC" firstStartedPulling="2025-09-30 
18:03:02.132557016 +0000 UTC m=+1232.655056254" lastFinishedPulling="2025-09-30 18:03:13.571021421 +0000 UTC m=+1244.093520659" observedRunningTime="2025-09-30 18:03:14.006289541 +0000 UTC m=+1244.528788779" watchObservedRunningTime="2025-09-30 18:03:14.017533248 +0000 UTC m=+1244.540032486" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.515345 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.624156 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.674149 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvgt9\" (UniqueName: \"kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9\") pod \"e0b9e58f-0903-462b-8e48-b8fbaca162d9\" (UID: \"e0b9e58f-0903-462b-8e48-b8fbaca162d9\") " Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.674828 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.696370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9" (OuterVolumeSpecName: "kube-api-access-kvgt9") pod "e0b9e58f-0903-462b-8e48-b8fbaca162d9" (UID: "e0b9e58f-0903-462b-8e48-b8fbaca162d9"). InnerVolumeSpecName "kube-api-access-kvgt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.776207 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hsdm\" (UniqueName: \"kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm\") pod \"e516f41b-fb09-421f-9cb2-a10e1a24f02c\" (UID: \"e516f41b-fb09-421f-9cb2-a10e1a24f02c\") " Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.776700 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvgt9\" (UniqueName: \"kubernetes.io/projected/e0b9e58f-0903-462b-8e48-b8fbaca162d9-kube-api-access-kvgt9\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.780034 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm" (OuterVolumeSpecName: "kube-api-access-2hsdm") pod "e516f41b-fb09-421f-9cb2-a10e1a24f02c" (UID: "e516f41b-fb09-421f-9cb2-a10e1a24f02c"). InnerVolumeSpecName "kube-api-access-2hsdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.877485 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnfdm\" (UniqueName: \"kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm\") pod \"b92a15c8-7839-4cad-ab41-953e0d5c44f1\" (UID: \"b92a15c8-7839-4cad-ab41-953e0d5c44f1\") " Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.877899 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hsdm\" (UniqueName: \"kubernetes.io/projected/e516f41b-fb09-421f-9cb2-a10e1a24f02c-kube-api-access-2hsdm\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.881073 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm" (OuterVolumeSpecName: "kube-api-access-vnfdm") pod "b92a15c8-7839-4cad-ab41-953e0d5c44f1" (UID: "b92a15c8-7839-4cad-ab41-953e0d5c44f1"). InnerVolumeSpecName "kube-api-access-vnfdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.952407 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:14 crc kubenswrapper[4797]: I0930 18:03:14.979618 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnfdm\" (UniqueName: \"kubernetes.io/projected/b92a15c8-7839-4cad-ab41-953e0d5c44f1-kube-api-access-vnfdm\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.005699 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mnf4c" event={"ID":"b92a15c8-7839-4cad-ab41-953e0d5c44f1","Type":"ContainerDied","Data":"6705ea344024592056aafda2fe852449d704f9fe75492b28856a50e571724f78"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.005739 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6705ea344024592056aafda2fe852449d704f9fe75492b28856a50e571724f78" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.005791 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mnf4c" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011897 4797 generic.go:334] "Generic (PLEG): container finished" podID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" exitCode=0 Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011926 4797 generic.go:334] "Generic (PLEG): container finished" podID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" exitCode=2 Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011935 4797 generic.go:334] "Generic (PLEG): container finished" podID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" exitCode=0 Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011942 4797 generic.go:334] "Generic (PLEG): container finished" podID="b933ea23-2970-4ec2-a390-8da0bb811eca" 
containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" exitCode=0 Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011958 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.011968 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerDied","Data":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.012080 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerDied","Data":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.012096 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerDied","Data":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.012134 4797 scope.go:117] "RemoveContainer" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.012780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerDied","Data":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.012813 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b933ea23-2970-4ec2-a390-8da0bb811eca","Type":"ContainerDied","Data":"173fab448e629adbb14e7e0d82eeda316754443424bbe69bc57d287561fdb2c5"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.014528 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6910128f-6ddf-4edf-86b6-a313f85db70d","Type":"ContainerStarted","Data":"2397b59f21d8c9ea59808ccb4d72079213c68489dd9586e9416f648dededf177"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.020737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sqrmz" event={"ID":"e0b9e58f-0903-462b-8e48-b8fbaca162d9","Type":"ContainerDied","Data":"49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.020757 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sqrmz" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.020770 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a4fbeb9034e5fce43fa5bcf03d7c85a2211c2ffce51271f7df4f4dd2de0812" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.033237 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5hxvf" event={"ID":"e516f41b-fb09-421f-9cb2-a10e1a24f02c","Type":"ContainerDied","Data":"21024b000d5f088818d612724f1183630dfdde9062e0fb9f8dd590e7d6bfb66a"} Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.033289 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21024b000d5f088818d612724f1183630dfdde9062e0fb9f8dd590e7d6bfb66a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.033313 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5hxvf" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.051061 4797 scope.go:117] "RemoveContainer" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.065540 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.065520833 podStartE2EDuration="3.065520833s" podCreationTimestamp="2025-09-30 18:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:15.045379453 +0000 UTC m=+1245.567878691" watchObservedRunningTime="2025-09-30 18:03:15.065520833 +0000 UTC m=+1245.588020071" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.076464 4797 scope.go:117] "RemoveContainer" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080495 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080764 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080870 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: 
\"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080891 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080967 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.080991 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.081127 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd6xq\" (UniqueName: \"kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq\") pod \"b933ea23-2970-4ec2-a390-8da0bb811eca\" (UID: \"b933ea23-2970-4ec2-a390-8da0bb811eca\") " Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.081381 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.081793 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.082306 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.086221 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq" (OuterVolumeSpecName: "kube-api-access-zd6xq") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "kube-api-access-zd6xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.093524 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts" (OuterVolumeSpecName: "scripts") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.100006 4797 scope.go:117] "RemoveContainer" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.129791 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.154084 4797 scope.go:117] "RemoveContainer" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.154524 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": container with ID starting with 37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35 not found: ID does not exist" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.154560 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} err="failed to get container status \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": rpc error: code = NotFound desc = could not find container \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": container with ID starting with 37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.154584 4797 scope.go:117] 
"RemoveContainer" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.154846 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": container with ID starting with 9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2 not found: ID does not exist" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.154868 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} err="failed to get container status \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": rpc error: code = NotFound desc = could not find container \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": container with ID starting with 9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.154960 4797 scope.go:117] "RemoveContainer" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.155156 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": container with ID starting with bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1 not found: ID does not exist" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155181 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} err="failed to get container status \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": rpc error: code = NotFound desc = could not find container \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": container with ID starting with bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155198 4797 scope.go:117] "RemoveContainer" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.155367 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": container with ID starting with 79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a not found: ID does not exist" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155388 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} err="failed to get container status \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": rpc error: code = NotFound desc = could not find container \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": container with ID starting with 79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155403 4797 scope.go:117] "RemoveContainer" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155665 4797 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} err="failed to get container status \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": rpc error: code = NotFound desc = could not find container \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": container with ID starting with 37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155683 4797 scope.go:117] "RemoveContainer" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155944 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} err="failed to get container status \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": rpc error: code = NotFound desc = could not find container \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": container with ID starting with 9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.155963 4797 scope.go:117] "RemoveContainer" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156134 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} err="failed to get container status \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": rpc error: code = NotFound desc = could not find container \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": container with ID starting with bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1 not 
found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156152 4797 scope.go:117] "RemoveContainer" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156724 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} err="failed to get container status \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": rpc error: code = NotFound desc = could not find container \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": container with ID starting with 79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156769 4797 scope.go:117] "RemoveContainer" containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156948 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} err="failed to get container status \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": rpc error: code = NotFound desc = could not find container \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": container with ID starting with 37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.156974 4797 scope.go:117] "RemoveContainer" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157417 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} err="failed to get 
container status \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": rpc error: code = NotFound desc = could not find container \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": container with ID starting with 9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157471 4797 scope.go:117] "RemoveContainer" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157717 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} err="failed to get container status \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": rpc error: code = NotFound desc = could not find container \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": container with ID starting with bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157738 4797 scope.go:117] "RemoveContainer" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157980 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} err="failed to get container status \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": rpc error: code = NotFound desc = could not find container \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": container with ID starting with 79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.157998 4797 scope.go:117] "RemoveContainer" 
containerID="37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158240 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35"} err="failed to get container status \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": rpc error: code = NotFound desc = could not find container \"37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35\": container with ID starting with 37680650de5c2eaa43de504bf1ceeb4d05e4dbdd64808a3186a32d57be133e35 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158260 4797 scope.go:117] "RemoveContainer" containerID="9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158509 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2"} err="failed to get container status \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": rpc error: code = NotFound desc = could not find container \"9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2\": container with ID starting with 9c4eb83814ef8076bd553a665e77d99f256c6fb9561cc3609433fe1a173ac1d2 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158535 4797 scope.go:117] "RemoveContainer" containerID="bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158796 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1"} err="failed to get container status \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": rpc error: code = NotFound desc = could 
not find container \"bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1\": container with ID starting with bb98099b899e15473a82c4238106049dd47d13d4a4f82b7b7291398ac6c070c1 not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.158814 4797 scope.go:117] "RemoveContainer" containerID="79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.159022 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a"} err="failed to get container status \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": rpc error: code = NotFound desc = could not find container \"79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a\": container with ID starting with 79b12c45cada4801dd0cc3bd0c88ecad548d6280097634298255299fadcaed3a not found: ID does not exist" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.173816 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.182957 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.182988 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b933ea23-2970-4ec2-a390-8da0bb811eca-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.182999 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd6xq\" (UniqueName: \"kubernetes.io/projected/b933ea23-2970-4ec2-a390-8da0bb811eca-kube-api-access-zd6xq\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.183010 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.183019 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.201582 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data" (OuterVolumeSpecName: "config-data") pod "b933ea23-2970-4ec2-a390-8da0bb811eca" (UID: "b933ea23-2970-4ec2-a390-8da0bb811eca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.284506 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b933ea23-2970-4ec2-a390-8da0bb811eca-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.351569 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.362928 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381229 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381731 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-notification-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381749 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-notification-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381759 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-central-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381767 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-central-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381789 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92a15c8-7839-4cad-ab41-953e0d5c44f1" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381795 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92a15c8-7839-4cad-ab41-953e0d5c44f1" 
containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381807 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516f41b-fb09-421f-9cb2-a10e1a24f02c" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381813 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516f41b-fb09-421f-9cb2-a10e1a24f02c" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381821 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="sg-core" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381827 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="sg-core" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381839 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b9e58f-0903-462b-8e48-b8fbaca162d9" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381845 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b9e58f-0903-462b-8e48-b8fbaca162d9" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: E0930 18:03:15.381856 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="proxy-httpd" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.381862 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="proxy-httpd" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382031 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="sg-core" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382047 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b92a15c8-7839-4cad-ab41-953e0d5c44f1" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382056 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-notification-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382068 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="ceilometer-central-agent" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382078 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516f41b-fb09-421f-9cb2-a10e1a24f02c" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382087 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b9e58f-0903-462b-8e48-b8fbaca162d9" containerName="mariadb-database-create" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.382097 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" containerName="proxy-httpd" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.383704 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.386953 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387014 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387051 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387103 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0" Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387156 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twfx9\" (UniqueName: \"kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.387176 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.388892 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.388928 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.392275 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488366 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488405 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488454 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488506 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twfx9\" (UniqueName: \"kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488527 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488578 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.488615 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.489288 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.489727 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.491706 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.491971 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.492096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.492947 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.505862 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twfx9\" (UniqueName: \"kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9\") pod \"ceilometer-0\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") " pod="openstack/ceilometer-0"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.582632 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6676d4ddcd-sxf6l" podUID="04e30fb7-7876-4a90-b887-05b7da2f7746" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused"
Sep 30 18:03:15 crc kubenswrapper[4797]: I0930 18:03:15.748391 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 18:03:16 crc kubenswrapper[4797]: I0930 18:03:16.249414 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b933ea23-2970-4ec2-a390-8da0bb811eca" path="/var/lib/kubelet/pods/b933ea23-2970-4ec2-a390-8da0bb811eca/volumes"
Sep 30 18:03:16 crc kubenswrapper[4797]: I0930 18:03:16.272080 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:03:17 crc kubenswrapper[4797]: I0930 18:03:17.073334 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerStarted","Data":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"}
Sep 30 18:03:17 crc kubenswrapper[4797]: I0930 18:03:17.073593 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerStarted","Data":"8b90df0df6a7747d0457a97d319eae7e561a707fcc026ca8b8c152317b3c4742"}
Sep 30 18:03:17 crc kubenswrapper[4797]: I0930 18:03:17.463953 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Sep 30 18:03:18 crc kubenswrapper[4797]: I0930 18:03:18.087330 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerStarted","Data":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"}
Sep 30 18:03:19 crc kubenswrapper[4797]: I0930 18:03:19.097337 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerStarted","Data":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"}
Sep 30 18:03:19 crc kubenswrapper[4797]: I0930 18:03:19.501501 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.146367 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6d23-account-create-fkt29"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.148503 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6d23-account-create-fkt29"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.151191 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.163065 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6d23-account-create-fkt29"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.300366 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zccjk\" (UniqueName: \"kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk\") pod \"nova-api-6d23-account-create-fkt29\" (UID: \"6233c517-db8f-4d92-9d2a-486931d3cd14\") " pod="openstack/nova-api-6d23-account-create-fkt29"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.309994 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.310256 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-log" containerID="cri-o://64af618ad6e8219fd85aee8724a74aca56c8bfde45b9584e6df25bd31828cb8e" gracePeriod=30
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.310347 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-httpd" containerID="cri-o://a5f82ab525bcec3ffb150b4bdb370e28f3f4f5b4ea32e797a0d7bbf00e064cfd" gracePeriod=30
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.364557 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7dbd-account-create-8qlmz"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.365854 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dbd-account-create-8qlmz"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.376831 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.387222 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7dbd-account-create-8qlmz"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.402836 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zccjk\" (UniqueName: \"kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk\") pod \"nova-api-6d23-account-create-fkt29\" (UID: \"6233c517-db8f-4d92-9d2a-486931d3cd14\") " pod="openstack/nova-api-6d23-account-create-fkt29"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.443400 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zccjk\" (UniqueName: \"kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk\") pod \"nova-api-6d23-account-create-fkt29\" (UID: \"6233c517-db8f-4d92-9d2a-486931d3cd14\") " pod="openstack/nova-api-6d23-account-create-fkt29"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.463902 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6d23-account-create-fkt29"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.506789 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwpv\" (UniqueName: \"kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv\") pod \"nova-cell0-7dbd-account-create-8qlmz\" (UID: \"a6188720-6c09-49d3-b419-9b8de57dd718\") " pod="openstack/nova-cell0-7dbd-account-create-8qlmz"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.551505 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5820-account-create-2xmfp"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.556983 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5820-account-create-2xmfp"]
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.557136 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5820-account-create-2xmfp"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.559929 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.608629 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwpv\" (UniqueName: \"kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv\") pod \"nova-cell0-7dbd-account-create-8qlmz\" (UID: \"a6188720-6c09-49d3-b419-9b8de57dd718\") " pod="openstack/nova-cell0-7dbd-account-create-8qlmz"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.608707 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsl4v\" (UniqueName: \"kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v\") pod \"nova-cell1-5820-account-create-2xmfp\" (UID: \"ec964b14-f7af-4e27-9b92-6201023d7cc1\") " pod="openstack/nova-cell1-5820-account-create-2xmfp"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.638371 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwpv\" (UniqueName: \"kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv\") pod \"nova-cell0-7dbd-account-create-8qlmz\" (UID: \"a6188720-6c09-49d3-b419-9b8de57dd718\") " pod="openstack/nova-cell0-7dbd-account-create-8qlmz"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.699133 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dbd-account-create-8qlmz"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.709479 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsl4v\" (UniqueName: \"kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v\") pod \"nova-cell1-5820-account-create-2xmfp\" (UID: \"ec964b14-f7af-4e27-9b92-6201023d7cc1\") " pod="openstack/nova-cell1-5820-account-create-2xmfp"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.730413 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsl4v\" (UniqueName: \"kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v\") pod \"nova-cell1-5820-account-create-2xmfp\" (UID: \"ec964b14-f7af-4e27-9b92-6201023d7cc1\") " pod="openstack/nova-cell1-5820-account-create-2xmfp"
Sep 30 18:03:21 crc kubenswrapper[4797]: I0930 18:03:21.887027 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5820-account-create-2xmfp"
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.039827 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6d23-account-create-fkt29"]
Sep 30 18:03:22 crc kubenswrapper[4797]: W0930 18:03:22.053792 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6233c517_db8f_4d92_9d2a_486931d3cd14.slice/crio-3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455 WatchSource:0}: Error finding container 3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455: Status 404 returned error can't find the container with id 3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.147659 4797 generic.go:334] "Generic (PLEG): container finished" podID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerID="64af618ad6e8219fd85aee8724a74aca56c8bfde45b9584e6df25bd31828cb8e" exitCode=143
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.147733 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerDied","Data":"64af618ad6e8219fd85aee8724a74aca56c8bfde45b9584e6df25bd31828cb8e"}
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.153997 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d23-account-create-fkt29" event={"ID":"6233c517-db8f-4d92-9d2a-486931d3cd14","Type":"ContainerStarted","Data":"3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455"}
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.173759 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerStarted","Data":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"}
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.173941 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-central-agent" containerID="cri-o://cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b" gracePeriod=30
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.174515 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.174768 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="proxy-httpd" containerID="cri-o://e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787" gracePeriod=30
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.174815 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="sg-core" containerID="cri-o://a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d" gracePeriod=30
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.174851 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-notification-agent" containerID="cri-o://74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235" gracePeriod=30
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.192259 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7dbd-account-create-8qlmz"]
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.207490 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.68378698 podStartE2EDuration="7.207473684s" podCreationTimestamp="2025-09-30 18:03:15 +0000 UTC" firstStartedPulling="2025-09-30 18:03:16.270655548 +0000 UTC m=+1246.793154786" lastFinishedPulling="2025-09-30 18:03:20.794342252 +0000 UTC m=+1251.316841490" observedRunningTime="2025-09-30 18:03:22.200622886 +0000 UTC m=+1252.723122124" watchObservedRunningTime="2025-09-30 18:03:22.207473684 +0000 UTC m=+1252.729972922"
Sep 30 18:03:22 crc kubenswrapper[4797]: W0930 18:03:22.213057 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6188720_6c09_49d3_b419_9b8de57dd718.slice/crio-ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6 WatchSource:0}: Error finding container ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6: Status 404 returned error can't find the container with id ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.368533 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5820-account-create-2xmfp"]
Sep 30 18:03:22 crc kubenswrapper[4797]: W0930 18:03:22.438109 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec964b14_f7af_4e27_9b92_6201023d7cc1.slice/crio-a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1 WatchSource:0}: Error finding container a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1: Status 404 returned error can't find the container with id a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1
Sep 30 18:03:22 crc kubenswrapper[4797]: I0930 18:03:22.761000 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.053862 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.054084 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-log" containerID="cri-o://7b99e91d80a68e7a3b3d90bf839e1e433c55613e00266315b8a324656d021a00" gracePeriod=30
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.054467 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-httpd" containerID="cri-o://7cbf42b793fda441fe7d3a0dd7ca5499f2c3b15021f492441b9ed565fac10351" gracePeriod=30
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.107346 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.184188 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec964b14-f7af-4e27-9b92-6201023d7cc1" containerID="9f5d0cbb47dfc1a5274fa316e9351d543fadcf3408574e33ffbafa96870294c8" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.184239 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5820-account-create-2xmfp" event={"ID":"ec964b14-f7af-4e27-9b92-6201023d7cc1","Type":"ContainerDied","Data":"9f5d0cbb47dfc1a5274fa316e9351d543fadcf3408574e33ffbafa96870294c8"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.184298 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5820-account-create-2xmfp" event={"ID":"ec964b14-f7af-4e27-9b92-6201023d7cc1","Type":"ContainerStarted","Data":"a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.191223 4797 generic.go:334] "Generic (PLEG): container finished" podID="a6188720-6c09-49d3-b419-9b8de57dd718" containerID="3f24f030bf63434f9582048b18eea66b229cf0dba04997ed328d1958c7e7f19f" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.191290 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dbd-account-create-8qlmz" event={"ID":"a6188720-6c09-49d3-b419-9b8de57dd718","Type":"ContainerDied","Data":"3f24f030bf63434f9582048b18eea66b229cf0dba04997ed328d1958c7e7f19f"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.191315 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dbd-account-create-8qlmz" event={"ID":"a6188720-6c09-49d3-b419-9b8de57dd718","Type":"ContainerStarted","Data":"ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.195140 4797 generic.go:334] "Generic (PLEG): container finished" podID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerID="7b99e91d80a68e7a3b3d90bf839e1e433c55613e00266315b8a324656d021a00" exitCode=143
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.195253 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerDied","Data":"7b99e91d80a68e7a3b3d90bf839e1e433c55613e00266315b8a324656d021a00"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.201118 4797 generic.go:334] "Generic (PLEG): container finished" podID="6233c517-db8f-4d92-9d2a-486931d3cd14" containerID="3ae8e0e237f66365ba92f165f32edfdaa8b8f7c8a6cc95abb1bcbb417675a776" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.201239 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d23-account-create-fkt29" event={"ID":"6233c517-db8f-4d92-9d2a-486931d3cd14","Type":"ContainerDied","Data":"3ae8e0e237f66365ba92f165f32edfdaa8b8f7c8a6cc95abb1bcbb417675a776"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220174 4797 generic.go:334] "Generic (PLEG): container finished" podID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220209 4797 generic.go:334] "Generic (PLEG): container finished" podID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d" exitCode=2
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220216 4797 generic.go:334] "Generic (PLEG): container finished" podID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220225 4797 generic.go:334] "Generic (PLEG): container finished" podID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b" exitCode=0
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerDied","Data":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220270 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerDied","Data":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220280 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerDied","Data":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220289 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerDied","Data":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220302 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425df5bc-4b0d-47f9-814f-4bfa5cdca407","Type":"ContainerDied","Data":"8b90df0df6a7747d0457a97d319eae7e561a707fcc026ca8b8c152317b3c4742"}
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220316 4797 scope.go:117] "RemoveContainer" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.220485 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.241969 4797 scope.go:117] "RemoveContainer" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247033 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247074 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twfx9\" (UniqueName: \"kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247248 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247271 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247318 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.247356 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd\") pod \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\" (UID: \"425df5bc-4b0d-47f9-814f-4bfa5cdca407\") "
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.248652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.248904 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.253000 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts" (OuterVolumeSpecName: "scripts") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.261710 4797 scope.go:117] "RemoveContainer" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.263649 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9" (OuterVolumeSpecName: "kube-api-access-twfx9") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "kube-api-access-twfx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.284798 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.327473 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.328729 4797 scope.go:117] "RemoveContainer" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349212 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349240 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349250 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twfx9\" (UniqueName: \"kubernetes.io/projected/425df5bc-4b0d-47f9-814f-4bfa5cdca407-kube-api-access-twfx9\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349261 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349269 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.349277 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425df5bc-4b0d-47f9-814f-4bfa5cdca407-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.356283 4797 scope.go:117] "RemoveContainer" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"
Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.360111 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": container with ID starting with e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787 not found: ID does not exist" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.360292 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"} err="failed to get container status \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": rpc error: code = NotFound desc = could not find container \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": container with ID starting with e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787 not found: ID does not exist"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.360342 4797 scope.go:117] "RemoveContainer" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"
Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.360811 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": container with ID starting with a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d not found: ID does not exist" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.360838 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"} err="failed to get container status \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": rpc error: code = NotFound desc = could not find container \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": container with ID starting with a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d not found: ID does not exist"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.360855 4797 scope.go:117] "RemoveContainer" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"
Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.361126 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": container with ID starting with 74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235 not found: ID does not exist" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361158 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"} err="failed to get container status \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": rpc error: code = NotFound desc = could not find container \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": container with ID starting with 74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235 not found: ID does not exist"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361177 4797 scope.go:117] "RemoveContainer" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"
Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.361756 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": container with ID starting with cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b not found: ID does not exist" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361785 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"} err="failed to get container status \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": rpc error: code = NotFound desc = could not find container \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": container with ID starting with cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b not found: ID does not exist"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361803 4797 scope.go:117] "RemoveContainer" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"
Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361980 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"} err="failed to get container status \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": rpc error: code = NotFound desc = could not find container \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": container with ID starting with e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787 not found: ID
does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.361998 4797 scope.go:117] "RemoveContainer" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362164 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"} err="failed to get container status \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": rpc error: code = NotFound desc = could not find container \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": container with ID starting with a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362181 4797 scope.go:117] "RemoveContainer" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362348 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"} err="failed to get container status \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": rpc error: code = NotFound desc = could not find container \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": container with ID starting with 74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235 not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362372 4797 scope.go:117] "RemoveContainer" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362555 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"} err="failed to get container 
status \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": rpc error: code = NotFound desc = could not find container \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": container with ID starting with cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362575 4797 scope.go:117] "RemoveContainer" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362755 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"} err="failed to get container status \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": rpc error: code = NotFound desc = could not find container \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": container with ID starting with e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787 not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362772 4797 scope.go:117] "RemoveContainer" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362937 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"} err="failed to get container status \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": rpc error: code = NotFound desc = could not find container \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": container with ID starting with a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.362954 4797 scope.go:117] "RemoveContainer" 
containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.363473 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"} err="failed to get container status \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": rpc error: code = NotFound desc = could not find container \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": container with ID starting with 74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235 not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.363499 4797 scope.go:117] "RemoveContainer" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.363712 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"} err="failed to get container status \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": rpc error: code = NotFound desc = could not find container \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": container with ID starting with cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.363730 4797 scope.go:117] "RemoveContainer" containerID="e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.364213 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787"} err="failed to get container status \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": rpc error: code = NotFound desc = could 
not find container \"e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787\": container with ID starting with e38d0593b434b49d2b2df3d485a4f0006a8709ceac70dc93373c1878466ed787 not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.369955 4797 scope.go:117] "RemoveContainer" containerID="a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.371935 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d"} err="failed to get container status \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": rpc error: code = NotFound desc = could not find container \"a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d\": container with ID starting with a0c8a53a207b3ad6ec0910fc204912ed156a44cece37f024afb6847bfcf27a1d not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.371964 4797 scope.go:117] "RemoveContainer" containerID="74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.372205 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data" (OuterVolumeSpecName: "config-data") pod "425df5bc-4b0d-47f9-814f-4bfa5cdca407" (UID: "425df5bc-4b0d-47f9-814f-4bfa5cdca407"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.372960 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235"} err="failed to get container status \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": rpc error: code = NotFound desc = could not find container \"74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235\": container with ID starting with 74f003551c679ada63666296f5c11e3c16f3b5413709a9094a4d36cb472cb235 not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.372988 4797 scope.go:117] "RemoveContainer" containerID="cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.373257 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b"} err="failed to get container status \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": rpc error: code = NotFound desc = could not find container \"cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b\": container with ID starting with cacb00a08a2e55da77bcb86f0c678ea15bce2129181bc0dbba925f731385252b not found: ID does not exist" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.450490 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425df5bc-4b0d-47f9-814f-4bfa5cdca407-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.548178 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.555659 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:23 crc 
kubenswrapper[4797]: I0930 18:03:23.570262 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.570667 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="sg-core" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.570684 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="sg-core" Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.570701 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-central-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.570707 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-central-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.570721 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="proxy-httpd" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.570727 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="proxy-httpd" Sep 30 18:03:23 crc kubenswrapper[4797]: E0930 18:03:23.570742 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-notification-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.570748 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-notification-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.570971 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="sg-core" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.571001 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="proxy-httpd" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.571015 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-central-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.571023 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" containerName="ceilometer-notification-agent" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.572957 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.584566 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.584787 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.584949 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.755896 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.756264 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: 
I0930 18:03:23.756416 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.756477 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.756509 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.756640 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.756689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fmwt\" (UniqueName: \"kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858562 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858620 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858749 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858776 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858867 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.858913 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fmwt\" (UniqueName: \"kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.859354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.859374 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.862215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.862297 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.863391 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " 
pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.865089 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.875287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fmwt\" (UniqueName: \"kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt\") pod \"ceilometer-0\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " pod="openstack/ceilometer-0" Sep 30 18:03:23 crc kubenswrapper[4797]: I0930 18:03:23.933859 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.286856 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425df5bc-4b0d-47f9-814f-4bfa5cdca407" path="/var/lib/kubelet/pods/425df5bc-4b0d-47f9-814f-4bfa5cdca407/volumes" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.437262 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:24 crc kubenswrapper[4797]: W0930 18:03:24.441628 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a3e187e_8a1e_4bcb_8687_3178eea149ef.slice/crio-c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10 WatchSource:0}: Error finding container c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10: Status 404 returned error can't find the container with id c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10 Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.495315 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" 
podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9292/healthcheck\": read tcp 10.217.0.2:35038->10.217.0.173:9292: read: connection reset by peer" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.495326 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.173:9292/healthcheck\": read tcp 10.217.0.2:35030->10.217.0.173:9292: read: connection reset by peer" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.598623 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.640838 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5820-account-create-2xmfp" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.748487 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dbd-account-create-8qlmz" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.772345 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6d23-account-create-fkt29" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.794677 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsl4v\" (UniqueName: \"kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v\") pod \"ec964b14-f7af-4e27-9b92-6201023d7cc1\" (UID: \"ec964b14-f7af-4e27-9b92-6201023d7cc1\") " Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.802936 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v" (OuterVolumeSpecName: "kube-api-access-rsl4v") pod "ec964b14-f7af-4e27-9b92-6201023d7cc1" (UID: "ec964b14-f7af-4e27-9b92-6201023d7cc1"). InnerVolumeSpecName "kube-api-access-rsl4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.897038 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zccjk\" (UniqueName: \"kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk\") pod \"6233c517-db8f-4d92-9d2a-486931d3cd14\" (UID: \"6233c517-db8f-4d92-9d2a-486931d3cd14\") " Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.897166 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwpv\" (UniqueName: \"kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv\") pod \"a6188720-6c09-49d3-b419-9b8de57dd718\" (UID: \"a6188720-6c09-49d3-b419-9b8de57dd718\") " Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.898063 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsl4v\" (UniqueName: \"kubernetes.io/projected/ec964b14-f7af-4e27-9b92-6201023d7cc1-kube-api-access-rsl4v\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.906113 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv" (OuterVolumeSpecName: "kube-api-access-mlwpv") pod "a6188720-6c09-49d3-b419-9b8de57dd718" (UID: "a6188720-6c09-49d3-b419-9b8de57dd718"). InnerVolumeSpecName "kube-api-access-mlwpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.909323 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk" (OuterVolumeSpecName: "kube-api-access-zccjk") pod "6233c517-db8f-4d92-9d2a-486931d3cd14" (UID: "6233c517-db8f-4d92-9d2a-486931d3cd14"). InnerVolumeSpecName "kube-api-access-zccjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.999705 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zccjk\" (UniqueName: \"kubernetes.io/projected/6233c517-db8f-4d92-9d2a-486931d3cd14-kube-api-access-zccjk\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:24 crc kubenswrapper[4797]: I0930 18:03:24.999735 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwpv\" (UniqueName: \"kubernetes.io/projected/a6188720-6c09-49d3-b419-9b8de57dd718-kube-api-access-mlwpv\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.257854 4797 generic.go:334] "Generic (PLEG): container finished" podID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerID="a5f82ab525bcec3ffb150b4bdb370e28f3f4f5b4ea32e797a0d7bbf00e064cfd" exitCode=0 Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.257929 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerDied","Data":"a5f82ab525bcec3ffb150b4bdb370e28f3f4f5b4ea32e797a0d7bbf00e064cfd"} Sep 30 
18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.259033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerStarted","Data":"c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10"} Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.260471 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d23-account-create-fkt29" event={"ID":"6233c517-db8f-4d92-9d2a-486931d3cd14","Type":"ContainerDied","Data":"3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455"} Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.260499 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d7019115e771b1ec8f4ac67bf7aed4c2a3089d599c39f8ceb4e1d8a2ad7f455" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.260493 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6d23-account-create-fkt29" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.262252 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5820-account-create-2xmfp" event={"ID":"ec964b14-f7af-4e27-9b92-6201023d7cc1","Type":"ContainerDied","Data":"a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1"} Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.262275 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a96050e97255b281f0743f62d2118ebc4a911063042263c6bba863f5da1a47b1" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.262314 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5820-account-create-2xmfp" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.266991 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7dbd-account-create-8qlmz" event={"ID":"a6188720-6c09-49d3-b419-9b8de57dd718","Type":"ContainerDied","Data":"ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6"} Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.267033 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5469456cf11af0e23ceafd514b4865f02c6f489cbdd6577fdda3778e9bd1f6" Sep 30 18:03:25 crc kubenswrapper[4797]: I0930 18:03:25.267089 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7dbd-account-create-8qlmz" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.289001 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerDied","Data":"7cbf42b793fda441fe7d3a0dd7ca5499f2c3b15021f492441b9ed565fac10351"} Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.291291 4797 generic.go:334] "Generic (PLEG): container finished" podID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerID="7cbf42b793fda441fe7d3a0dd7ca5499f2c3b15021f492441b9ed565fac10351" exitCode=0 Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.294188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerStarted","Data":"5b9dd7fae86ef0be2206f53e3e8716549909735eafa07aa7396578a8b42d8812"} Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.421116 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.524741 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nskfk\" (UniqueName: \"kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.524796 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525521 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525574 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525602 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525635 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525658 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.525693 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts\") pod \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\" (UID: \"168d0430-28b2-43f2-a5fe-f8b0c35cec53\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.526898 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.527113 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs" (OuterVolumeSpecName: "logs") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.530992 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk" (OuterVolumeSpecName: "kube-api-access-nskfk") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "kube-api-access-nskfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.544801 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.549568 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts" (OuterVolumeSpecName: "scripts") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.585582 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.597496 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ltgsf"] Sep 30 18:03:26 crc kubenswrapper[4797]: E0930 18:03:26.598009 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-httpd" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598034 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-httpd" Sep 30 18:03:26 crc kubenswrapper[4797]: E0930 18:03:26.598067 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec964b14-f7af-4e27-9b92-6201023d7cc1" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598077 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec964b14-f7af-4e27-9b92-6201023d7cc1" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: E0930 18:03:26.598095 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6188720-6c09-49d3-b419-9b8de57dd718" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598103 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6188720-6c09-49d3-b419-9b8de57dd718" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: E0930 18:03:26.598118 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-log" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598127 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-log" Sep 30 18:03:26 crc kubenswrapper[4797]: E0930 18:03:26.598144 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6233c517-db8f-4d92-9d2a-486931d3cd14" 
containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598152 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6233c517-db8f-4d92-9d2a-486931d3cd14" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598423 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec964b14-f7af-4e27-9b92-6201023d7cc1" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598466 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6188720-6c09-49d3-b419-9b8de57dd718" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598484 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-log" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598496 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6233c517-db8f-4d92-9d2a-486931d3cd14" containerName="mariadb-account-create" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.598508 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" containerName="glance-httpd" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.599298 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.604521 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ts8m8" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.604771 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.605037 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.607938 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ltgsf"] Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632402 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632487 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " 
pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632606 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84glc\" (UniqueName: \"kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632760 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nskfk\" (UniqueName: \"kubernetes.io/projected/168d0430-28b2-43f2-a5fe-f8b0c35cec53-kube-api-access-nskfk\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632780 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632795 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632818 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632831 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/168d0430-28b2-43f2-a5fe-f8b0c35cec53-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.632843 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.646637 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.689661 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.690099 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data" (OuterVolumeSpecName: "config-data") pod "168d0430-28b2-43f2-a5fe-f8b0c35cec53" (UID: "168d0430-28b2-43f2-a5fe-f8b0c35cec53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.734380 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.734471 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84glc\" (UniqueName: \"kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.734706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.740159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.755495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " 
pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.755651 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.755924 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.755940 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168d0430-28b2-43f2-a5fe-f8b0c35cec53-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.755951 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.762816 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.769262 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84glc\" (UniqueName: \"kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc\") pod \"nova-cell0-conductor-db-sync-ltgsf\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " 
pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.873300 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.921118 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.959983 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960032 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960128 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960185 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960258 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h927d\" (UniqueName: \"kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960355 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.960403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts\") pod \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\" (UID: \"7aa42c46-daf1-4414-9141-ff067cd3e2a2\") " Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.961553 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs" (OuterVolumeSpecName: "logs") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.964564 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts" (OuterVolumeSpecName: "scripts") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.964777 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.969640 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d" (OuterVolumeSpecName: "kube-api-access-h927d") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "kube-api-access-h927d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:26 crc kubenswrapper[4797]: I0930 18:03:26.985055 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.011196 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.045186 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.052549 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data" (OuterVolumeSpecName: "config-data") pod "7aa42c46-daf1-4414-9141-ff067cd3e2a2" (UID: "7aa42c46-daf1-4414-9141-ff067cd3e2a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062622 4797 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062659 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062670 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062678 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc 
kubenswrapper[4797]: I0930 18:03:27.062708 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062717 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa42c46-daf1-4414-9141-ff067cd3e2a2-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062726 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa42c46-daf1-4414-9141-ff067cd3e2a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.062734 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h927d\" (UniqueName: \"kubernetes.io/projected/7aa42c46-daf1-4414-9141-ff067cd3e2a2-kube-api-access-h927d\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.102155 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.165278 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.326873 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerStarted","Data":"ee29e6878947000c47f020f8ece4e901886e8a298913a397615a841d09e8da59"} Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.333392 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"168d0430-28b2-43f2-a5fe-f8b0c35cec53","Type":"ContainerDied","Data":"1b041671028f619525d8dd798258a7c1492f259f4a760931b41ece6829df093f"} Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.333446 4797 scope.go:117] "RemoveContainer" containerID="a5f82ab525bcec3ffb150b4bdb370e28f3f4f5b4ea32e797a0d7bbf00e064cfd" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.333591 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.356198 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7aa42c46-daf1-4414-9141-ff067cd3e2a2","Type":"ContainerDied","Data":"19f5604cbd2d0b5dba90679db050135f5d08afe79aa0a7697206840466bff82a"} Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.357365 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.400658 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.408386 4797 scope.go:117] "RemoveContainer" containerID="64af618ad6e8219fd85aee8724a74aca56c8bfde45b9584e6df25bd31828cb8e" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.425719 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.444837 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: E0930 18:03:27.445265 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-httpd" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.445284 4797 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-httpd" Sep 30 18:03:27 crc kubenswrapper[4797]: E0930 18:03:27.445305 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-log" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.445313 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-log" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.445520 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-log" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.445542 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" containerName="glance-httpd" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.446870 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.452816 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.453091 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.453158 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n2zcg" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.452819 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.467951 4797 scope.go:117] "RemoveContainer" containerID="7cbf42b793fda441fe7d3a0dd7ca5499f2c3b15021f492441b9ed565fac10351" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.475488 4797 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.482626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-logs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.482743 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-config-data\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.482827 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.482976 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.483044 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.483077 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lg8k\" (UniqueName: \"kubernetes.io/projected/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-kube-api-access-2lg8k\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.483148 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.483254 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.490757 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.499136 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.521106 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.522812 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.524589 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.525302 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.535407 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.542631 4797 scope.go:117] "RemoveContainer" containerID="7b99e91d80a68e7a3b3d90bf839e1e433c55613e00266315b8a324656d021a00" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.558397 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ltgsf"] Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.584563 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.584795 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-scripts\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585001 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lg8k\" (UniqueName: \"kubernetes.io/projected/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-kube-api-access-2lg8k\") pod 
\"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585191 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585280 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-logs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-config-data\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585455 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " 
pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.585832 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.586871 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-logs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.587062 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.604305 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.604570 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-config-data\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 
18:03:27.612927 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-scripts\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.618314 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lg8k\" (UniqueName: \"kubernetes.io/projected/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-kube-api-access-2lg8k\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.620236 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7484ca2f-31fc-4ede-bdcc-2ce25e4d5023-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.628223 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023\") " pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.686821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.686862 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-logs\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.686882 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.686912 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.687100 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.687174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24bbf\" (UniqueName: \"kubernetes.io/projected/2bd684cf-1443-4068-8b8f-7b1961474c80-kube-api-access-24bbf\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.687354 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.687590 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.774203 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789502 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789528 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789552 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789579 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789612 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24bbf\" (UniqueName: \"kubernetes.io/projected/2bd684cf-1443-4068-8b8f-7b1961474c80-kube-api-access-24bbf\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.789673 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.790009 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.790757 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.791128 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bd684cf-1443-4068-8b8f-7b1961474c80-logs\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.793979 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.794703 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc 
kubenswrapper[4797]: I0930 18:03:27.795052 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.802129 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd684cf-1443-4068-8b8f-7b1961474c80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.811096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24bbf\" (UniqueName: \"kubernetes.io/projected/2bd684cf-1443-4068-8b8f-7b1961474c80-kube-api-access-24bbf\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.832245 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"2bd684cf-1443-4068-8b8f-7b1961474c80\") " pod="openstack/glance-default-internal-api-0" Sep 30 18:03:27 crc kubenswrapper[4797]: I0930 18:03:27.842554 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.251618 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168d0430-28b2-43f2-a5fe-f8b0c35cec53" path="/var/lib/kubelet/pods/168d0430-28b2-43f2-a5fe-f8b0c35cec53/volumes" Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.253043 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa42c46-daf1-4414-9141-ff067cd3e2a2" path="/var/lib/kubelet/pods/7aa42c46-daf1-4414-9141-ff067cd3e2a2/volumes" Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.331612 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 18:03:28 crc kubenswrapper[4797]: W0930 18:03:28.335529 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7484ca2f_31fc_4ede_bdcc_2ce25e4d5023.slice/crio-82259d4e571708f21b0cbfea94c573a133b6ccebabfa957fe1ca0eb5f8e3b8e0 WatchSource:0}: Error finding container 82259d4e571708f21b0cbfea94c573a133b6ccebabfa957fe1ca0eb5f8e3b8e0: Status 404 returned error can't find the container with id 82259d4e571708f21b0cbfea94c573a133b6ccebabfa957fe1ca0eb5f8e3b8e0 Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.377964 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerStarted","Data":"0b4a481a85d3c7ed8512a3acc645bf3694785efe9387e8b6adedb90a962f4cc1"} Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.379774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" event={"ID":"d90801fb-fe0a-4517-be34-b1ad52f0029e","Type":"ContainerStarted","Data":"63365fc162416c0964fc0c3b7effac6f52ce145cf3b0fcc22c27810c3eaf9a0a"} Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.381497 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023","Type":"ContainerStarted","Data":"82259d4e571708f21b0cbfea94c573a133b6ccebabfa957fe1ca0eb5f8e3b8e0"} Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.519284 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 18:03:28 crc kubenswrapper[4797]: I0930 18:03:28.922793 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:03:29 crc kubenswrapper[4797]: I0930 18:03:29.430452 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023","Type":"ContainerStarted","Data":"9fcabb6764ee297d0a52ad4801faa366aca089b6780fb7fdb85a9a0f71b53dca"} Sep 30 18:03:29 crc kubenswrapper[4797]: I0930 18:03:29.432032 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2bd684cf-1443-4068-8b8f-7b1961474c80","Type":"ContainerStarted","Data":"63784c01c65920261716facafefb3edd54ce731f1f027e5275fa8b4d7140109b"} Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.446474 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7484ca2f-31fc-4ede-bdcc-2ce25e4d5023","Type":"ContainerStarted","Data":"2fd0f11647ff5d1d636c98b04407e451c36191bc0ed26013e291355cb980411f"} Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.457636 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2bd684cf-1443-4068-8b8f-7b1961474c80","Type":"ContainerStarted","Data":"0d559ddf20e347e38850cc26762d631d86504d5268193021ecb88825aa14e11e"} Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.458011 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2bd684cf-1443-4068-8b8f-7b1961474c80","Type":"ContainerStarted","Data":"6d131e38b83089bc9428537318955bd42bc8fed2ae62a731ca9c74b6305fc8c7"} Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.462785 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerStarted","Data":"1466891f6e258498ed63ff7c280a5a201496a1c5b34361a655ecbbedff8b7ae3"} Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.462914 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-central-agent" containerID="cri-o://5b9dd7fae86ef0be2206f53e3e8716549909735eafa07aa7396578a8b42d8812" gracePeriod=30 Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.462982 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.463014 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-notification-agent" containerID="cri-o://ee29e6878947000c47f020f8ece4e901886e8a298913a397615a841d09e8da59" gracePeriod=30 Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.463013 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="sg-core" containerID="cri-o://0b4a481a85d3c7ed8512a3acc645bf3694785efe9387e8b6adedb90a962f4cc1" gracePeriod=30 Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.463054 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="proxy-httpd" containerID="cri-o://1466891f6e258498ed63ff7c280a5a201496a1c5b34361a655ecbbedff8b7ae3" 
gracePeriod=30 Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.470337 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.470317009 podStartE2EDuration="3.470317009s" podCreationTimestamp="2025-09-30 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:30.467219034 +0000 UTC m=+1260.989718272" watchObservedRunningTime="2025-09-30 18:03:30.470317009 +0000 UTC m=+1260.992816247" Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.491745 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.491731893 podStartE2EDuration="3.491731893s" podCreationTimestamp="2025-09-30 18:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:30.487239151 +0000 UTC m=+1261.009738389" watchObservedRunningTime="2025-09-30 18:03:30.491731893 +0000 UTC m=+1261.014231131" Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.515821 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7160251 podStartE2EDuration="7.515800491s" podCreationTimestamp="2025-09-30 18:03:23 +0000 UTC" firstStartedPulling="2025-09-30 18:03:24.447489825 +0000 UTC m=+1254.969989063" lastFinishedPulling="2025-09-30 18:03:29.247265216 +0000 UTC m=+1259.769764454" observedRunningTime="2025-09-30 18:03:30.512552111 +0000 UTC m=+1261.035051349" watchObservedRunningTime="2025-09-30 18:03:30.515800491 +0000 UTC m=+1261.038299729" Sep 30 18:03:30 crc kubenswrapper[4797]: I0930 18:03:30.987003 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6676d4ddcd-sxf6l" Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 
18:03:31.069398 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.069647 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon-log" containerID="cri-o://28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff" gracePeriod=30 Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.070034 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" containerID="cri-o://ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de" gracePeriod=30 Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478505 4797 generic.go:334] "Generic (PLEG): container finished" podID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerID="1466891f6e258498ed63ff7c280a5a201496a1c5b34361a655ecbbedff8b7ae3" exitCode=0 Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478806 4797 generic.go:334] "Generic (PLEG): container finished" podID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerID="0b4a481a85d3c7ed8512a3acc645bf3694785efe9387e8b6adedb90a962f4cc1" exitCode=2 Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478816 4797 generic.go:334] "Generic (PLEG): container finished" podID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerID="ee29e6878947000c47f020f8ece4e901886e8a298913a397615a841d09e8da59" exitCode=0 Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478591 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerDied","Data":"1466891f6e258498ed63ff7c280a5a201496a1c5b34361a655ecbbedff8b7ae3"} Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478899 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerDied","Data":"0b4a481a85d3c7ed8512a3acc645bf3694785efe9387e8b6adedb90a962f4cc1"} Sep 30 18:03:31 crc kubenswrapper[4797]: I0930 18:03:31.478913 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerDied","Data":"ee29e6878947000c47f020f8ece4e901886e8a298913a397615a841d09e8da59"} Sep 30 18:03:34 crc kubenswrapper[4797]: I0930 18:03:34.516402 4797 generic.go:334] "Generic (PLEG): container finished" podID="e43627c6-a815-4487-b13d-ff9a402fa860" containerID="ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de" exitCode=0 Sep 30 18:03:34 crc kubenswrapper[4797]: I0930 18:03:34.516474 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerDied","Data":"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de"} Sep 30 18:03:34 crc kubenswrapper[4797]: I0930 18:03:34.519977 4797 generic.go:334] "Generic (PLEG): container finished" podID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerID="5b9dd7fae86ef0be2206f53e3e8716549909735eafa07aa7396578a8b42d8812" exitCode=0 Sep 30 18:03:34 crc kubenswrapper[4797]: I0930 18:03:34.520014 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerDied","Data":"5b9dd7fae86ef0be2206f53e3e8716549909735eafa07aa7396578a8b42d8812"} Sep 30 18:03:35 crc kubenswrapper[4797]: I0930 18:03:35.411413 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Sep 30 18:03:36 crc 
kubenswrapper[4797]: I0930 18:03:36.547285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a3e187e-8a1e-4bcb-8687-3178eea149ef","Type":"ContainerDied","Data":"c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10"} Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.547590 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e417eaca2e0d41a33f126c4470b59fe56919e96ec868282f07a7f4d3dfda10" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.558958 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685156 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685423 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fmwt\" (UniqueName: \"kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685545 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685616 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts\") pod 
\"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685676 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685759 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.685840 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle\") pod \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\" (UID: \"3a3e187e-8a1e-4bcb-8687-3178eea149ef\") " Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.686104 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.686391 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.687048 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.689218 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt" (OuterVolumeSpecName: "kube-api-access-5fmwt") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "kube-api-access-5fmwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.693581 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts" (OuterVolumeSpecName: "scripts") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.717161 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.785880 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data" (OuterVolumeSpecName: "config-data") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.788401 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.788622 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fmwt\" (UniqueName: \"kubernetes.io/projected/3a3e187e-8a1e-4bcb-8687-3178eea149ef-kube-api-access-5fmwt\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.788639 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.788648 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a3e187e-8a1e-4bcb-8687-3178eea149ef-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.788656 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.802617 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a3e187e-8a1e-4bcb-8687-3178eea149ef" (UID: "3a3e187e-8a1e-4bcb-8687-3178eea149ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:36 crc kubenswrapper[4797]: I0930 18:03:36.890684 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e187e-8a1e-4bcb-8687-3178eea149ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.557692 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" event={"ID":"d90801fb-fe0a-4517-be34-b1ad52f0029e","Type":"ContainerStarted","Data":"a9632e2d8fcf7cca7c0136d063836f5c19209df1e5727ddb63c33ca9086a247c"} Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.557715 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.582260 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" podStartSLOduration=2.645397148 podStartE2EDuration="11.58224124s" podCreationTimestamp="2025-09-30 18:03:26 +0000 UTC" firstStartedPulling="2025-09-30 18:03:27.559284592 +0000 UTC m=+1258.081783830" lastFinishedPulling="2025-09-30 18:03:36.496128694 +0000 UTC m=+1267.018627922" observedRunningTime="2025-09-30 18:03:37.574024485 +0000 UTC m=+1268.096523723" watchObservedRunningTime="2025-09-30 18:03:37.58224124 +0000 UTC m=+1268.104740478" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.600082 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.619178 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.631097 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:37 crc kubenswrapper[4797]: E0930 18:03:37.631730 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-central-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.631760 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-central-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: E0930 18:03:37.631798 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="proxy-httpd" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.631810 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="proxy-httpd" Sep 30 18:03:37 crc kubenswrapper[4797]: E0930 18:03:37.631837 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-notification-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.631848 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-notification-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: E0930 18:03:37.631880 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="sg-core" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.631890 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="sg-core" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.632242 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-central-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.632274 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="sg-core" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.632295 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="ceilometer-notification-agent" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.632317 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" containerName="proxy-httpd" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.635196 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.637856 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.638610 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.638931 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712143 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712426 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712465 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712504 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfjm\" (UniqueName: \"kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " 
pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712526 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712541 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.712565 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.775232 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.775294 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.813468 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814820 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814895 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814918 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfjm\" (UniqueName: \"kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814964 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.814981 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.815017 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.815641 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.816093 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.820163 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.820604 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.822578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.822806 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.827966 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.838069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfjm\" (UniqueName: \"kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm\") pod \"ceilometer-0\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " pod="openstack/ceilometer-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.842740 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.842801 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.881713 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.912063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:37 crc kubenswrapper[4797]: I0930 18:03:37.967647 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.247573 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3e187e-8a1e-4bcb-8687-3178eea149ef" path="/var/lib/kubelet/pods/3a3e187e-8a1e-4bcb-8687-3178eea149ef/volumes" Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.491007 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:38 crc kubenswrapper[4797]: W0930 18:03:38.494287 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b5f877_bbee_454e_9331_530dfc722921.slice/crio-62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839 WatchSource:0}: Error finding container 62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839: Status 404 returned error can't find the container with id 62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839 Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.567476 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerStarted","Data":"62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839"} Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.568573 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.568707 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.568808 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 18:03:38 crc kubenswrapper[4797]: I0930 18:03:38.568910 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Sep 30 18:03:39 crc kubenswrapper[4797]: I0930 18:03:39.576480 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerStarted","Data":"592ba570993cef9acfa989375a15e368922a1c1d2f8bd4524b80f15a393657f9"} Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.457724 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.586894 4797 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.587719 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerStarted","Data":"710ec4d876a01c70f41d96c08edd8e40a6ca4aa51ca85b1b6c0bdac813c31535"} Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.735098 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.735164 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 18:03:40 crc kubenswrapper[4797]: I0930 18:03:40.755673 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 18:03:41 crc kubenswrapper[4797]: I0930 18:03:41.399216 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:41 crc kubenswrapper[4797]: I0930 18:03:41.400546 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" containerName="watcher-decision-engine" 
containerID="cri-o://16815b620a0c0bde09222225a3d16bf86d63b745e21f7334c4cf084e5dd9d911" gracePeriod=30 Sep 30 18:03:41 crc kubenswrapper[4797]: I0930 18:03:41.598660 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerStarted","Data":"6097f2da00d39afd813bfce60bba79e4998f0c7907c8e2fc299c038c31c18a5f"} Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.408843 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610116 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerStarted","Data":"84fda0e5dc81e1865434ee46de854012c61b7e9abbcc06163aed5055efbca85a"} Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610270 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-central-agent" containerID="cri-o://592ba570993cef9acfa989375a15e368922a1c1d2f8bd4524b80f15a393657f9" gracePeriod=30 Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610338 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610680 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="proxy-httpd" containerID="cri-o://84fda0e5dc81e1865434ee46de854012c61b7e9abbcc06163aed5055efbca85a" gracePeriod=30 Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610726 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="sg-core" 
containerID="cri-o://6097f2da00d39afd813bfce60bba79e4998f0c7907c8e2fc299c038c31c18a5f" gracePeriod=30 Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.610760 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-notification-agent" containerID="cri-o://710ec4d876a01c70f41d96c08edd8e40a6ca4aa51ca85b1b6c0bdac813c31535" gracePeriod=30 Sep 30 18:03:42 crc kubenswrapper[4797]: I0930 18:03:42.636637 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.931495637 podStartE2EDuration="5.636615299s" podCreationTimestamp="2025-09-30 18:03:37 +0000 UTC" firstStartedPulling="2025-09-30 18:03:38.504229415 +0000 UTC m=+1269.026728653" lastFinishedPulling="2025-09-30 18:03:42.209349077 +0000 UTC m=+1272.731848315" observedRunningTime="2025-09-30 18:03:42.632080436 +0000 UTC m=+1273.154579674" watchObservedRunningTime="2025-09-30 18:03:42.636615299 +0000 UTC m=+1273.159114537" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.619590 4797 generic.go:334] "Generic (PLEG): container finished" podID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" containerID="16815b620a0c0bde09222225a3d16bf86d63b745e21f7334c4cf084e5dd9d911" exitCode=0 Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.619627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b","Type":"ContainerDied","Data":"16815b620a0c0bde09222225a3d16bf86d63b745e21f7334c4cf084e5dd9d911"} Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.620219 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b","Type":"ContainerDied","Data":"9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b"} Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 
18:03:43.620244 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a7b641e62b0a0aac8cd2bfa1272c1da3896255511029c9a6bde3ffbc59a4b6b" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639307 4797 generic.go:334] "Generic (PLEG): container finished" podID="f8b5f877-bbee-454e-9331-530dfc722921" containerID="84fda0e5dc81e1865434ee46de854012c61b7e9abbcc06163aed5055efbca85a" exitCode=0 Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639340 4797 generic.go:334] "Generic (PLEG): container finished" podID="f8b5f877-bbee-454e-9331-530dfc722921" containerID="6097f2da00d39afd813bfce60bba79e4998f0c7907c8e2fc299c038c31c18a5f" exitCode=2 Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639349 4797 generic.go:334] "Generic (PLEG): container finished" podID="f8b5f877-bbee-454e-9331-530dfc722921" containerID="710ec4d876a01c70f41d96c08edd8e40a6ca4aa51ca85b1b6c0bdac813c31535" exitCode=0 Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639370 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerDied","Data":"84fda0e5dc81e1865434ee46de854012c61b7e9abbcc06163aed5055efbca85a"} Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639397 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerDied","Data":"6097f2da00d39afd813bfce60bba79e4998f0c7907c8e2fc299c038c31c18a5f"} Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.639408 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerDied","Data":"710ec4d876a01c70f41d96c08edd8e40a6ca4aa51ca85b1b6c0bdac813c31535"} Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.732769 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.843856 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs\") pod \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.843980 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qtc9\" (UniqueName: \"kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9\") pod \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.844040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data\") pod \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.844090 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle\") pod \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.844138 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca\") pod \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\" (UID: \"7f9c8682-0d0e-4fca-b9a2-95191f7ec66b\") " Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.844445 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs" (OuterVolumeSpecName: "logs") pod "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" (UID: "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.844840 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.849420 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9" (OuterVolumeSpecName: "kube-api-access-4qtc9") pod "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" (UID: "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b"). InnerVolumeSpecName "kube-api-access-4qtc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.879175 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" (UID: "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.909589 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" (UID: "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.936643 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data" (OuterVolumeSpecName: "config-data") pod "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" (UID: "7f9c8682-0d0e-4fca-b9a2-95191f7ec66b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.947001 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qtc9\" (UniqueName: \"kubernetes.io/projected/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-kube-api-access-4qtc9\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.947033 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.947043 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:43 crc kubenswrapper[4797]: I0930 18:03:43.947051 4797 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.647985 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.669069 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.677736 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.721562 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:44 crc kubenswrapper[4797]: E0930 18:03:44.722109 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" containerName="watcher-decision-engine" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.722130 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" containerName="watcher-decision-engine" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.722387 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" containerName="watcher-decision-engine" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.723206 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.725615 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.743804 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.874224 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.874300 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.874376 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w667l\" (UniqueName: \"kubernetes.io/projected/103ee950-a749-41ce-be1e-bdfb715bc7ad-kube-api-access-w667l\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.874413 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-config-data\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " 
pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.874486 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103ee950-a749-41ce-be1e-bdfb715bc7ad-logs\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.976502 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-config-data\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.976645 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103ee950-a749-41ce-be1e-bdfb715bc7ad-logs\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.976842 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.976909 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 
18:03:44.977016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w667l\" (UniqueName: \"kubernetes.io/projected/103ee950-a749-41ce-be1e-bdfb715bc7ad-kube-api-access-w667l\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.977179 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103ee950-a749-41ce-be1e-bdfb715bc7ad-logs\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.981953 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.985623 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:44 crc kubenswrapper[4797]: I0930 18:03:44.990458 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103ee950-a749-41ce-be1e-bdfb715bc7ad-config-data\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:45 crc kubenswrapper[4797]: I0930 18:03:45.002865 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w667l\" 
(UniqueName: \"kubernetes.io/projected/103ee950-a749-41ce-be1e-bdfb715bc7ad-kube-api-access-w667l\") pod \"watcher-decision-engine-0\" (UID: \"103ee950-a749-41ce-be1e-bdfb715bc7ad\") " pod="openstack/watcher-decision-engine-0" Sep 30 18:03:45 crc kubenswrapper[4797]: I0930 18:03:45.046050 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:45 crc kubenswrapper[4797]: I0930 18:03:45.410865 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Sep 30 18:03:45 crc kubenswrapper[4797]: I0930 18:03:45.540813 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 18:03:45 crc kubenswrapper[4797]: I0930 18:03:45.660217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"103ee950-a749-41ce-be1e-bdfb715bc7ad","Type":"ContainerStarted","Data":"9e44039a72009062109cf099986162d284b2fb3264f0e9ebde8e635b65e4cd6b"} Sep 30 18:03:46 crc kubenswrapper[4797]: I0930 18:03:46.248540 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9c8682-0d0e-4fca-b9a2-95191f7ec66b" path="/var/lib/kubelet/pods/7f9c8682-0d0e-4fca-b9a2-95191f7ec66b/volumes" Sep 30 18:03:46 crc kubenswrapper[4797]: I0930 18:03:46.673748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"103ee950-a749-41ce-be1e-bdfb715bc7ad","Type":"ContainerStarted","Data":"aa7002d6c204d3d99de4602fad46ab6fd48403fe50b7c983f040007086de01d2"} Sep 30 18:03:46 crc kubenswrapper[4797]: I0930 18:03:46.704856 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.704829042 podStartE2EDuration="2.704829042s" podCreationTimestamp="2025-09-30 18:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:46.692456064 +0000 UTC m=+1277.214955312" watchObservedRunningTime="2025-09-30 18:03:46.704829042 +0000 UTC m=+1277.227328280" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.683819 4797 generic.go:334] "Generic (PLEG): container finished" podID="f8b5f877-bbee-454e-9331-530dfc722921" containerID="592ba570993cef9acfa989375a15e368922a1c1d2f8bd4524b80f15a393657f9" exitCode=0 Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.683908 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerDied","Data":"592ba570993cef9acfa989375a15e368922a1c1d2f8bd4524b80f15a393657f9"} Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.684353 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b5f877-bbee-454e-9331-530dfc722921","Type":"ContainerDied","Data":"62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839"} Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.684368 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62ee30eb659c23c026e72e3f11825956d5327fad782918a7f89b9f5c59ae5839" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.782629 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936112 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfjm\" (UniqueName: \"kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936170 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936273 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936358 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936411 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936492 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.936594 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle\") pod \"f8b5f877-bbee-454e-9331-530dfc722921\" (UID: \"f8b5f877-bbee-454e-9331-530dfc722921\") " Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.937770 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.938238 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.942213 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm" (OuterVolumeSpecName: "kube-api-access-2xfjm") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "kube-api-access-2xfjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.943676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts" (OuterVolumeSpecName: "scripts") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:47 crc kubenswrapper[4797]: I0930 18:03:47.967146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.021732 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039241 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfjm\" (UniqueName: \"kubernetes.io/projected/f8b5f877-bbee-454e-9331-530dfc722921-kube-api-access-2xfjm\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039369 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039485 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039549 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039611 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b5f877-bbee-454e-9331-530dfc722921-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.039671 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.070287 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data" (OuterVolumeSpecName: "config-data") pod "f8b5f877-bbee-454e-9331-530dfc722921" (UID: "f8b5f877-bbee-454e-9331-530dfc722921"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.141834 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5f877-bbee-454e-9331-530dfc722921-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.695161 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.792784 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.803999 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.813325 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:48 crc kubenswrapper[4797]: E0930 18:03:48.814066 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="sg-core" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.814160 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="sg-core" Sep 30 18:03:48 crc kubenswrapper[4797]: E0930 18:03:48.814255 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="proxy-httpd" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.814357 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="proxy-httpd" Sep 30 18:03:48 crc kubenswrapper[4797]: E0930 18:03:48.814455 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-notification-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 
18:03:48.814543 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-notification-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: E0930 18:03:48.814630 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-central-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.814703 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-central-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.814986 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="proxy-httpd" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.815095 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-notification-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.815182 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="ceilometer-central-agent" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.815257 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5f877-bbee-454e-9331-530dfc722921" containerName="sg-core" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.817399 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.819909 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.820060 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.822122 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.869589 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.869663 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.869766 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.869830 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " 
pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.869916 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.870029 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8lc\" (UniqueName: \"kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.870074 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971300 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8lc\" (UniqueName: \"kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971355 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971381 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971395 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971453 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971494 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.971548 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.972234 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " 
pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.972499 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.977973 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.978528 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.994248 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.996398 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:48 crc kubenswrapper[4797]: I0930 18:03:48.998706 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8lc\" (UniqueName: 
\"kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc\") pod \"ceilometer-0\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " pod="openstack/ceilometer-0" Sep 30 18:03:49 crc kubenswrapper[4797]: I0930 18:03:49.139278 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:03:49 crc kubenswrapper[4797]: W0930 18:03:49.685088 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b870b2_8f3b_4d26_9c77_272986d31c97.slice/crio-2d5976bd4237764207bc3b6b2c8032d34f84bd5feb417cd97523c7438fbd8887 WatchSource:0}: Error finding container 2d5976bd4237764207bc3b6b2c8032d34f84bd5feb417cd97523c7438fbd8887: Status 404 returned error can't find the container with id 2d5976bd4237764207bc3b6b2c8032d34f84bd5feb417cd97523c7438fbd8887 Sep 30 18:03:49 crc kubenswrapper[4797]: I0930 18:03:49.689503 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:03:49 crc kubenswrapper[4797]: I0930 18:03:49.692952 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:03:49 crc kubenswrapper[4797]: I0930 18:03:49.712076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerStarted","Data":"2d5976bd4237764207bc3b6b2c8032d34f84bd5feb417cd97523c7438fbd8887"} Sep 30 18:03:50 crc kubenswrapper[4797]: I0930 18:03:50.260904 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b5f877-bbee-454e-9331-530dfc722921" path="/var/lib/kubelet/pods/f8b5f877-bbee-454e-9331-530dfc722921/volumes" Sep 30 18:03:50 crc kubenswrapper[4797]: I0930 18:03:50.726912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerStarted","Data":"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878"} Sep 30 18:03:51 crc kubenswrapper[4797]: I0930 18:03:51.736280 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerStarted","Data":"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390"} Sep 30 18:03:52 crc kubenswrapper[4797]: I0930 18:03:52.749934 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerStarted","Data":"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373"} Sep 30 18:03:54 crc kubenswrapper[4797]: I0930 18:03:54.782641 4797 generic.go:334] "Generic (PLEG): container finished" podID="d90801fb-fe0a-4517-be34-b1ad52f0029e" containerID="a9632e2d8fcf7cca7c0136d063836f5c19209df1e5727ddb63c33ca9086a247c" exitCode=0 Sep 30 18:03:54 crc kubenswrapper[4797]: I0930 18:03:54.783282 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" event={"ID":"d90801fb-fe0a-4517-be34-b1ad52f0029e","Type":"ContainerDied","Data":"a9632e2d8fcf7cca7c0136d063836f5c19209df1e5727ddb63c33ca9086a247c"} Sep 30 18:03:54 crc kubenswrapper[4797]: I0930 18:03:54.786735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerStarted","Data":"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb"} Sep 30 18:03:54 crc kubenswrapper[4797]: I0930 18:03:54.787548 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:03:54 crc kubenswrapper[4797]: I0930 18:03:54.832591 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9753127360000002 
podStartE2EDuration="6.83256836s" podCreationTimestamp="2025-09-30 18:03:48 +0000 UTC" firstStartedPulling="2025-09-30 18:03:49.68915149 +0000 UTC m=+1280.211650728" lastFinishedPulling="2025-09-30 18:03:53.546407114 +0000 UTC m=+1284.068906352" observedRunningTime="2025-09-30 18:03:54.820042217 +0000 UTC m=+1285.342541455" watchObservedRunningTime="2025-09-30 18:03:54.83256836 +0000 UTC m=+1285.355067608" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.047214 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.090650 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.411114 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-ff7898f76-hfsxf" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.411313 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.803956 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:55 crc kubenswrapper[4797]: I0930 18:03:55.850839 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.244367 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.435401 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle\") pod \"d90801fb-fe0a-4517-be34-b1ad52f0029e\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.435839 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84glc\" (UniqueName: \"kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc\") pod \"d90801fb-fe0a-4517-be34-b1ad52f0029e\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.436112 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts\") pod \"d90801fb-fe0a-4517-be34-b1ad52f0029e\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.436261 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") pod \"d90801fb-fe0a-4517-be34-b1ad52f0029e\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.443696 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts" (OuterVolumeSpecName: "scripts") pod "d90801fb-fe0a-4517-be34-b1ad52f0029e" (UID: "d90801fb-fe0a-4517-be34-b1ad52f0029e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.450590 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc" (OuterVolumeSpecName: "kube-api-access-84glc") pod "d90801fb-fe0a-4517-be34-b1ad52f0029e" (UID: "d90801fb-fe0a-4517-be34-b1ad52f0029e"). InnerVolumeSpecName "kube-api-access-84glc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:56 crc kubenswrapper[4797]: E0930 18:03:56.468986 4797 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data podName:d90801fb-fe0a-4517-be34-b1ad52f0029e nodeName:}" failed. No retries permitted until 2025-09-30 18:03:56.968947735 +0000 UTC m=+1287.491446973 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data") pod "d90801fb-fe0a-4517-be34-b1ad52f0029e" (UID: "d90801fb-fe0a-4517-be34-b1ad52f0029e") : error deleting /var/lib/kubelet/pods/d90801fb-fe0a-4517-be34-b1ad52f0029e/volume-subpaths: remove /var/lib/kubelet/pods/d90801fb-fe0a-4517-be34-b1ad52f0029e/volume-subpaths: no such file or directory Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.472321 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d90801fb-fe0a-4517-be34-b1ad52f0029e" (UID: "d90801fb-fe0a-4517-be34-b1ad52f0029e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.538510 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.538542 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84glc\" (UniqueName: \"kubernetes.io/projected/d90801fb-fe0a-4517-be34-b1ad52f0029e-kube-api-access-84glc\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.538555 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.814339 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.814521 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ltgsf" event={"ID":"d90801fb-fe0a-4517-be34-b1ad52f0029e","Type":"ContainerDied","Data":"63365fc162416c0964fc0c3b7effac6f52ce145cf3b0fcc22c27810c3eaf9a0a"} Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.814541 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63365fc162416c0964fc0c3b7effac6f52ce145cf3b0fcc22c27810c3eaf9a0a" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.941207 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 18:03:56 crc kubenswrapper[4797]: E0930 18:03:56.941737 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90801fb-fe0a-4517-be34-b1ad52f0029e" containerName="nova-cell0-conductor-db-sync" Sep 30 18:03:56 crc 
kubenswrapper[4797]: I0930 18:03:56.941760 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90801fb-fe0a-4517-be34-b1ad52f0029e" containerName="nova-cell0-conductor-db-sync" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.942200 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90801fb-fe0a-4517-be34-b1ad52f0029e" containerName="nova-cell0-conductor-db-sync" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.942978 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.945548 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.945620 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.945677 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4786s\" (UniqueName: \"kubernetes.io/projected/627d236c-592a-46d3-9ef3-5adc1749c0c9-kube-api-access-4786s\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:56 crc kubenswrapper[4797]: I0930 18:03:56.971027 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.047281 
4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") pod \"d90801fb-fe0a-4517-be34-b1ad52f0029e\" (UID: \"d90801fb-fe0a-4517-be34-b1ad52f0029e\") " Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.047564 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.047619 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.047672 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4786s\" (UniqueName: \"kubernetes.io/projected/627d236c-592a-46d3-9ef3-5adc1749c0c9-kube-api-access-4786s\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.051963 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data" (OuterVolumeSpecName: "config-data") pod "d90801fb-fe0a-4517-be34-b1ad52f0029e" (UID: "d90801fb-fe0a-4517-be34-b1ad52f0029e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.052243 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.052513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d236c-592a-46d3-9ef3-5adc1749c0c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.065036 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4786s\" (UniqueName: \"kubernetes.io/projected/627d236c-592a-46d3-9ef3-5adc1749c0c9-kube-api-access-4786s\") pod \"nova-cell0-conductor-0\" (UID: \"627d236c-592a-46d3-9ef3-5adc1749c0c9\") " pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.150296 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90801fb-fe0a-4517-be34-b1ad52f0029e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.262410 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.726130 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 18:03:57 crc kubenswrapper[4797]: I0930 18:03:57.824142 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"627d236c-592a-46d3-9ef3-5adc1749c0c9","Type":"ContainerStarted","Data":"a2d25817f230c6a80b27cf6686bbd0d6f9c00cd0d1c8bada60ed00f9facbfdc9"} Sep 30 18:03:58 crc kubenswrapper[4797]: I0930 18:03:58.837136 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"627d236c-592a-46d3-9ef3-5adc1749c0c9","Type":"ContainerStarted","Data":"0ecbac7f9f21eaeb936636b3193c69d46ee9ea9e2ccea1ae2035f756a6147462"} Sep 30 18:03:58 crc kubenswrapper[4797]: I0930 18:03:58.837733 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 18:03:58 crc kubenswrapper[4797]: I0930 18:03:58.854551 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.85453321 podStartE2EDuration="2.85453321s" podCreationTimestamp="2025-09-30 18:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:03:58.853302176 +0000 UTC m=+1289.375801414" watchObservedRunningTime="2025-09-30 18:03:58.85453321 +0000 UTC m=+1289.377032448" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.522082 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.577866 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svzqx\" (UniqueName: \"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.577968 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.578034 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.578206 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.578288 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.589585 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx" (OuterVolumeSpecName: "kube-api-access-svzqx") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "kube-api-access-svzqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.589820 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.617782 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.620173 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts" (OuterVolumeSpecName: "scripts") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.644210 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). 
InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.679896 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.679959 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data\") pod \"e43627c6-a815-4487-b13d-ff9a402fa860\" (UID: \"e43627c6-a815-4487-b13d-ff9a402fa860\") " Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.680301 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.680316 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.680328 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svzqx\" (UniqueName: \"kubernetes.io/projected/e43627c6-a815-4487-b13d-ff9a402fa860-kube-api-access-svzqx\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.680341 4797 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e43627c6-a815-4487-b13d-ff9a402fa860-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.680351 4797 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.681127 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs" (OuterVolumeSpecName: "logs") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.705242 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data" (OuterVolumeSpecName: "config-data") pod "e43627c6-a815-4487-b13d-ff9a402fa860" (UID: "e43627c6-a815-4487-b13d-ff9a402fa860"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.781709 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e43627c6-a815-4487-b13d-ff9a402fa860-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.781760 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e43627c6-a815-4487-b13d-ff9a402fa860-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.877838 4797 generic.go:334] "Generic (PLEG): container finished" podID="e43627c6-a815-4487-b13d-ff9a402fa860" containerID="28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff" exitCode=137 Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.878097 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" 
event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerDied","Data":"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff"} Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.878130 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff7898f76-hfsxf" event={"ID":"e43627c6-a815-4487-b13d-ff9a402fa860","Type":"ContainerDied","Data":"c94794a1acd4f98b3642b69d428b6178b9427c0d83ee1e4aab018c545082bf62"} Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.878149 4797 scope.go:117] "RemoveContainer" containerID="ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.878292 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ff7898f76-hfsxf" Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.918486 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:04:01 crc kubenswrapper[4797]: I0930 18:04:01.928609 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-ff7898f76-hfsxf"] Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.051161 4797 scope.go:117] "RemoveContainer" containerID="28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff" Sep 30 18:04:02 crc kubenswrapper[4797]: E0930 18:04:02.074625 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode43627c6_a815_4487_b13d_ff9a402fa860.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode43627c6_a815_4487_b13d_ff9a402fa860.slice/crio-c94794a1acd4f98b3642b69d428b6178b9427c0d83ee1e4aab018c545082bf62\": RecentStats: unable to find data in memory cache]" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.076948 4797 scope.go:117] "RemoveContainer" 
containerID="ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de" Sep 30 18:04:02 crc kubenswrapper[4797]: E0930 18:04:02.077468 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de\": container with ID starting with ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de not found: ID does not exist" containerID="ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.077497 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de"} err="failed to get container status \"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de\": rpc error: code = NotFound desc = could not find container \"ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de\": container with ID starting with ae69c02239ea084c139039f8dffa74bf0dc144cfc27293ae63761020012af7de not found: ID does not exist" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.077521 4797 scope.go:117] "RemoveContainer" containerID="28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff" Sep 30 18:04:02 crc kubenswrapper[4797]: E0930 18:04:02.077801 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff\": container with ID starting with 28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff not found: ID does not exist" containerID="28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.077830 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff"} err="failed to get container status \"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff\": rpc error: code = NotFound desc = could not find container \"28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff\": container with ID starting with 28eaf105f9ecb751826b4782c3e392a775c8b0c8b2e6093f582a68419b47e9ff not found: ID does not exist" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.252085 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" path="/var/lib/kubelet/pods/e43627c6-a815-4487-b13d-ff9a402fa860/volumes" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.305803 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.767278 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wfh7r"] Sep 30 18:04:02 crc kubenswrapper[4797]: E0930 18:04:02.774857 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.774912 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" Sep 30 18:04:02 crc kubenswrapper[4797]: E0930 18:04:02.774972 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon-log" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.774979 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon-log" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.775675 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" 
containerName="horizon-log" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.775716 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43627c6-a815-4487-b13d-ff9a402fa860" containerName="horizon" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.776702 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.781410 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.781527 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.798661 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfh7r"] Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.805374 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.805471 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmrt\" (UniqueName: \"kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.805496 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.805516 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.907506 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.907633 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmrt\" (UniqueName: \"kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.907657 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.907695 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.913452 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.914531 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.922993 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.945585 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmrt\" (UniqueName: \"kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt\") pod \"nova-cell0-cell-mapping-wfh7r\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.979612 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.981154 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.984204 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.991247 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:02 crc kubenswrapper[4797]: I0930 18:04:02.994199 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.004836 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.006740 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010520 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010598 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010626 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: 
I0930 18:04:03.010712 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnvz\" (UniqueName: \"kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010773 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.010807 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94cbp\" (UniqueName: \"kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.023214 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.110178 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.112936 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113017 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnvz\" (UniqueName: \"kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113071 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113103 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94cbp\" (UniqueName: \"kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 
30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.113206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.114012 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.115033 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.118771 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.124944 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.129674 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.140302 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.146562 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.154007 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.154841 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnvz\" (UniqueName: \"kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz\") pod \"nova-scheduler-0\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.164228 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 
18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.167376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94cbp\" (UniqueName: \"kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp\") pod \"nova-api-0\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.246310 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.247914 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.251967 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.255513 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.257132 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.279369 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.304940 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.319220 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.319474 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.319504 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7ktw\" (UniqueName: \"kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.319641 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.359929 4797 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422405 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422545 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422571 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422629 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 
30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422692 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlwgh\" (UniqueName: \"kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422779 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7ktw\" (UniqueName: \"kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422828 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: 
I0930 18:04:03.422848 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422885 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.422907 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvngx\" (UniqueName: \"kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.426401 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.427211 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.431558 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.455253 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.476205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7ktw\" (UniqueName: \"kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw\") pod \"nova-metadata-0\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524682 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524727 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524754 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524781 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvngx\" (UniqueName: \"kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524858 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524886 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524932 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.524975 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlwgh\" (UniqueName: \"kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.525021 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.525998 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.526833 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.527849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config\") pod 
\"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.528470 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.528680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.561614 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.601022 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlwgh\" (UniqueName: \"kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh\") pod \"dnsmasq-dns-845d6d6f59-69vds\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.601075 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.649039 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.659769 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvngx\" (UniqueName: \"kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx\") pod \"nova-cell1-novncproxy-0\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.660871 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.790196 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfh7r"] Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.920101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfh7r" event={"ID":"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb","Type":"ContainerStarted","Data":"9738e49884b06a7a05627d086de454267819b9ec9fbb9300285dbd55004a7aa4"} Sep 30 18:04:03 crc kubenswrapper[4797]: I0930 18:04:03.941995 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.074674 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.254036 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.524231 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.541648 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.592582 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mltht"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.625598 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.631150 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mltht"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.633886 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.634091 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.741824 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.751689 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts\") pod 
\"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.751738 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.751824 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8pw\" (UniqueName: \"kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.751878 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.853707 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.853761 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.853856 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8pw\" (UniqueName: \"kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.853919 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.863693 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.869058 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.869359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.888628 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8pw\" (UniqueName: \"kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw\") pod \"nova-cell1-conductor-db-sync-mltht\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.936560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerStarted","Data":"40604753d9b3db2f319eb7375bba1b5eeef3482e0cf0d1cac96502166623c95a"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.938162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerStarted","Data":"eb19b2082fc65e65001c2dc5c544c33b25eed208c43bff1a26a1cb427b905ae4"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.948506 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfh7r" event={"ID":"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb","Type":"ContainerStarted","Data":"325fb9a9a6a2712d0b38659c12486128d10aea0778f1f423920016f9da1f46e8"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.957382 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf6f89ef-650a-461e-8a62-fb023d066be8","Type":"ContainerStarted","Data":"6f41e41a38f90114e45e6dfa5fb25508a3441983f5722f9832d16d383e377fba"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.958533 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93447e34-8d1f-42de-8345-224b4154ad10","Type":"ContainerStarted","Data":"2a30613434079fe3a0c28e7f7b4a0cd6b6b8c8392fcaa3ed5d1ad677c9a4e66e"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.962571 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerStarted","Data":"8ba8fbaf8a87414db99e619ed9e300ec9d78a3bf57971ae3ca9c2a1eeb3a13b9"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.962606 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerStarted","Data":"1c366c5a89b9eb1dbc04ce08a1ba24393a3682cd24fe0382e8b588523aecfff9"} Sep 30 18:04:04 crc kubenswrapper[4797]: I0930 18:04:04.988272 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wfh7r" podStartSLOduration=2.98825143 podStartE2EDuration="2.98825143s" podCreationTimestamp="2025-09-30 18:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:04.965600082 +0000 UTC m=+1295.488099330" watchObservedRunningTime="2025-09-30 18:04:04.98825143 +0000 UTC m=+1295.510750678" Sep 30 18:04:05 crc kubenswrapper[4797]: I0930 18:04:05.050260 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:05 crc kubenswrapper[4797]: I0930 18:04:05.621811 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mltht"] Sep 30 18:04:05 crc kubenswrapper[4797]: I0930 18:04:05.983600 4797 generic.go:334] "Generic (PLEG): container finished" podID="74964b4b-482a-4d85-a081-7e0625c93056" containerID="8ba8fbaf8a87414db99e619ed9e300ec9d78a3bf57971ae3ca9c2a1eeb3a13b9" exitCode=0 Sep 30 18:04:05 crc kubenswrapper[4797]: I0930 18:04:05.983837 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerDied","Data":"8ba8fbaf8a87414db99e619ed9e300ec9d78a3bf57971ae3ca9c2a1eeb3a13b9"} Sep 30 18:04:06 crc kubenswrapper[4797]: I0930 18:04:06.911502 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:06 crc kubenswrapper[4797]: I0930 18:04:06.919727 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:07 crc kubenswrapper[4797]: I0930 18:04:07.002962 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mltht" event={"ID":"ab000491-3207-401f-bd75-033e8b569622","Type":"ContainerStarted","Data":"19cc452614139c3ba184f9330e65d33b9a95c5bd62762c86268688bc7351aad3"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.020510 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93447e34-8d1f-42de-8345-224b4154ad10","Type":"ContainerStarted","Data":"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.020667 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="93447e34-8d1f-42de-8345-224b4154ad10" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7" gracePeriod=30 Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.023821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mltht" event={"ID":"ab000491-3207-401f-bd75-033e8b569622","Type":"ContainerStarted","Data":"36664ee1695bbc4f1051bc55f5404f12faafc9e456f39e5f5dbb71b3240d0162"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.029599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerStarted","Data":"d8ba5334b065be7485c370f94caa6e088fe691fe068ffdd5a3ca62405baf8b7b"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.029650 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.032070 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerStarted","Data":"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.033357 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf6f89ef-650a-461e-8a62-fb023d066be8","Type":"ContainerStarted","Data":"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0"} Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.060770 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.157458149 podStartE2EDuration="5.060746035s" podCreationTimestamp="2025-09-30 18:04:03 +0000 UTC" firstStartedPulling="2025-09-30 18:04:04.733841806 +0000 UTC m=+1295.256341034" lastFinishedPulling="2025-09-30 18:04:07.637129682 
+0000 UTC m=+1298.159628920" observedRunningTime="2025-09-30 18:04:08.047253076 +0000 UTC m=+1298.569752324" watchObservedRunningTime="2025-09-30 18:04:08.060746035 +0000 UTC m=+1298.583245273" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.078129 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mltht" podStartSLOduration=4.078101958 podStartE2EDuration="4.078101958s" podCreationTimestamp="2025-09-30 18:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:08.065211806 +0000 UTC m=+1298.587711044" watchObservedRunningTime="2025-09-30 18:04:08.078101958 +0000 UTC m=+1298.600601196" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.108010 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.559257332 podStartE2EDuration="6.107994854s" podCreationTimestamp="2025-09-30 18:04:02 +0000 UTC" firstStartedPulling="2025-09-30 18:04:04.090296621 +0000 UTC m=+1294.612795859" lastFinishedPulling="2025-09-30 18:04:07.639034143 +0000 UTC m=+1298.161533381" observedRunningTime="2025-09-30 18:04:08.081606563 +0000 UTC m=+1298.604105821" watchObservedRunningTime="2025-09-30 18:04:08.107994854 +0000 UTC m=+1298.630494092" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.117038 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" podStartSLOduration=5.117018 podStartE2EDuration="5.117018s" podCreationTimestamp="2025-09-30 18:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:08.103675366 +0000 UTC m=+1298.626174594" watchObservedRunningTime="2025-09-30 18:04:08.117018 +0000 UTC m=+1298.639517238" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 
18:04:08.360983 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 18:04:08 crc kubenswrapper[4797]: I0930 18:04:08.942916 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.045166 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerStarted","Data":"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921"} Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.045220 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerStarted","Data":"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c"} Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.049947 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerStarted","Data":"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613"} Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.050771 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-log" containerID="cri-o://dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" gracePeriod=30 Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.050780 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-metadata" containerID="cri-o://def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" gracePeriod=30 Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.071988 4797 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.7093090220000002 podStartE2EDuration="7.071964346s" podCreationTimestamp="2025-09-30 18:04:02 +0000 UTC" firstStartedPulling="2025-09-30 18:04:04.270933471 +0000 UTC m=+1294.793432709" lastFinishedPulling="2025-09-30 18:04:07.633588755 +0000 UTC m=+1298.156088033" observedRunningTime="2025-09-30 18:04:09.068256275 +0000 UTC m=+1299.590755513" watchObservedRunningTime="2025-09-30 18:04:09.071964346 +0000 UTC m=+1299.594463584" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.089419 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.989136469 podStartE2EDuration="6.089399722s" podCreationTimestamp="2025-09-30 18:04:03 +0000 UTC" firstStartedPulling="2025-09-30 18:04:04.537043254 +0000 UTC m=+1295.059542492" lastFinishedPulling="2025-09-30 18:04:07.637306507 +0000 UTC m=+1298.159805745" observedRunningTime="2025-09-30 18:04:09.087804509 +0000 UTC m=+1299.610303757" watchObservedRunningTime="2025-09-30 18:04:09.089399722 +0000 UTC m=+1299.611898950" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.751687 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.875467 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7ktw\" (UniqueName: \"kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw\") pod \"813abf3c-23fa-4141-a10f-ec9e411958ef\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.875595 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle\") pod \"813abf3c-23fa-4141-a10f-ec9e411958ef\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.875720 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs\") pod \"813abf3c-23fa-4141-a10f-ec9e411958ef\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.875813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data\") pod \"813abf3c-23fa-4141-a10f-ec9e411958ef\" (UID: \"813abf3c-23fa-4141-a10f-ec9e411958ef\") " Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.896641 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw" (OuterVolumeSpecName: "kube-api-access-t7ktw") pod "813abf3c-23fa-4141-a10f-ec9e411958ef" (UID: "813abf3c-23fa-4141-a10f-ec9e411958ef"). InnerVolumeSpecName "kube-api-access-t7ktw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.897594 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs" (OuterVolumeSpecName: "logs") pod "813abf3c-23fa-4141-a10f-ec9e411958ef" (UID: "813abf3c-23fa-4141-a10f-ec9e411958ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.918931 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data" (OuterVolumeSpecName: "config-data") pod "813abf3c-23fa-4141-a10f-ec9e411958ef" (UID: "813abf3c-23fa-4141-a10f-ec9e411958ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.939751 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "813abf3c-23fa-4141-a10f-ec9e411958ef" (UID: "813abf3c-23fa-4141-a10f-ec9e411958ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.979538 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.979574 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7ktw\" (UniqueName: \"kubernetes.io/projected/813abf3c-23fa-4141-a10f-ec9e411958ef-kube-api-access-t7ktw\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.979588 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813abf3c-23fa-4141-a10f-ec9e411958ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:09 crc kubenswrapper[4797]: I0930 18:04:09.979603 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813abf3c-23fa-4141-a10f-ec9e411958ef-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.058902 4797 generic.go:334] "Generic (PLEG): container finished" podID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerID="def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" exitCode=0 Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.058931 4797 generic.go:334] "Generic (PLEG): container finished" podID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerID="dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" exitCode=143 Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.059833 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.060696 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerDied","Data":"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613"} Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.060748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerDied","Data":"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629"} Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.060759 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"813abf3c-23fa-4141-a10f-ec9e411958ef","Type":"ContainerDied","Data":"eb19b2082fc65e65001c2dc5c544c33b25eed208c43bff1a26a1cb427b905ae4"} Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.060774 4797 scope.go:117] "RemoveContainer" containerID="def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.104271 4797 scope.go:117] "RemoveContainer" containerID="dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.111938 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.142016 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.143849 4797 scope.go:117] "RemoveContainer" containerID="def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" Sep 30 18:04:10 crc kubenswrapper[4797]: E0930 18:04:10.144363 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613\": container with ID starting with def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613 not found: ID does not exist" containerID="def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.144412 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613"} err="failed to get container status \"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613\": rpc error: code = NotFound desc = could not find container \"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613\": container with ID starting with def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613 not found: ID does not exist" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.144468 4797 scope.go:117] "RemoveContainer" containerID="dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" Sep 30 18:04:10 crc kubenswrapper[4797]: E0930 18:04:10.144767 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629\": container with ID starting with dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629 not found: ID does not exist" containerID="dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.144807 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629"} err="failed to get container status \"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629\": rpc error: code = NotFound desc = could not find container \"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629\": container with ID 
starting with dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629 not found: ID does not exist" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.144839 4797 scope.go:117] "RemoveContainer" containerID="def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.145168 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613"} err="failed to get container status \"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613\": rpc error: code = NotFound desc = could not find container \"def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613\": container with ID starting with def1b448f8586a29e4893097b936b2d0ffc50e3bd563680041f7de8677ccb613 not found: ID does not exist" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.145189 4797 scope.go:117] "RemoveContainer" containerID="dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.145388 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629"} err="failed to get container status \"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629\": rpc error: code = NotFound desc = could not find container \"dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629\": container with ID starting with dd4d9598d2bab122a54cd066ac37d60df77865c3d20153faf27984968ced7629 not found: ID does not exist" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.150941 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:10 crc kubenswrapper[4797]: E0930 18:04:10.151712 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" 
containerName="nova-metadata-log" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.151806 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-log" Sep 30 18:04:10 crc kubenswrapper[4797]: E0930 18:04:10.151920 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-metadata" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.151996 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-metadata" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.152327 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-log" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.152475 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" containerName="nova-metadata-metadata" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.153920 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.156535 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.160574 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.167315 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.248762 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813abf3c-23fa-4141-a10f-ec9e411958ef" path="/var/lib/kubelet/pods/813abf3c-23fa-4141-a10f-ec9e411958ef/volumes" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.290120 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.290291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.290408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.290584 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.290630 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pph6s\" (UniqueName: \"kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.392624 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.392724 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.392765 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.392786 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pph6s\" 
(UniqueName: \"kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.392893 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.393222 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.399057 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.404215 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.407981 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc 
kubenswrapper[4797]: I0930 18:04:10.413756 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pph6s\" (UniqueName: \"kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s\") pod \"nova-metadata-0\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " pod="openstack/nova-metadata-0" Sep 30 18:04:10 crc kubenswrapper[4797]: I0930 18:04:10.475243 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:11 crc kubenswrapper[4797]: I0930 18:04:11.011544 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:11 crc kubenswrapper[4797]: I0930 18:04:11.069383 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerStarted","Data":"dc782ba11f507b10063e92a0e949a12302cb416963244bed7adeb90741f8f736"} Sep 30 18:04:12 crc kubenswrapper[4797]: I0930 18:04:12.082262 4797 generic.go:334] "Generic (PLEG): container finished" podID="81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" containerID="325fb9a9a6a2712d0b38659c12486128d10aea0778f1f423920016f9da1f46e8" exitCode=0 Sep 30 18:04:12 crc kubenswrapper[4797]: I0930 18:04:12.082315 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfh7r" event={"ID":"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb","Type":"ContainerDied","Data":"325fb9a9a6a2712d0b38659c12486128d10aea0778f1f423920016f9da1f46e8"} Sep 30 18:04:12 crc kubenswrapper[4797]: I0930 18:04:12.084708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerStarted","Data":"371c0f21c89a332692b7c6d4c80843f0d911011c478e7cce01bbb21a138cf6f2"} Sep 30 18:04:12 crc kubenswrapper[4797]: I0930 18:04:12.084749 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerStarted","Data":"5e059c39fc6e01fedf0eec75b279fe94bc6a1def69beae0d6b999943344c1583"} Sep 30 18:04:12 crc kubenswrapper[4797]: I0930 18:04:12.130085 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.130065547 podStartE2EDuration="2.130065547s" podCreationTimestamp="2025-09-30 18:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:12.121993377 +0000 UTC m=+1302.644492625" watchObservedRunningTime="2025-09-30 18:04:12.130065547 +0000 UTC m=+1302.652564785" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.360817 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.392168 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.427093 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.427161 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.502759 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.656247 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts\") pod \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.656709 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle\") pod \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.656831 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmrt\" (UniqueName: \"kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt\") pod \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.656920 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data\") pod \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\" (UID: \"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb\") " Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.663338 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt" (OuterVolumeSpecName: "kube-api-access-vhmrt") pod "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" (UID: "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb"). InnerVolumeSpecName "kube-api-access-vhmrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.663797 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.665864 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts" (OuterVolumeSpecName: "scripts") pod "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" (UID: "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.697625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data" (OuterVolumeSpecName: "config-data") pod "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" (UID: "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.741743 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"] Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.742016 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="dnsmasq-dns" containerID="cri-o://ee2aa224c3f2eacb283380e32219de391b77d94d5a09e4cf0deee6f7cc10fafa" gracePeriod=10 Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.759145 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmrt\" (UniqueName: \"kubernetes.io/projected/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-kube-api-access-vhmrt\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.759181 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.759193 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.789551 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" (UID: "81d1bea4-d095-4542-8a3d-7fdffe9ab4bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:13 crc kubenswrapper[4797]: I0930 18:04:13.861707 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.106324 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfh7r" event={"ID":"81d1bea4-d095-4542-8a3d-7fdffe9ab4bb","Type":"ContainerDied","Data":"9738e49884b06a7a05627d086de454267819b9ec9fbb9300285dbd55004a7aa4"} Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.106367 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9738e49884b06a7a05627d086de454267819b9ec9fbb9300285dbd55004a7aa4" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.106493 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfh7r" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.120820 4797 generic.go:334] "Generic (PLEG): container finished" podID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerID="ee2aa224c3f2eacb283380e32219de391b77d94d5a09e4cf0deee6f7cc10fafa" exitCode=0 Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.120866 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" event={"ID":"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2","Type":"ContainerDied","Data":"ee2aa224c3f2eacb283380e32219de391b77d94d5a09e4cf0deee6f7cc10fafa"} Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.167570 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.176399 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.196254 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.196299 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.260203 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.260408 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-log" containerID="cri-o://f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921" gracePeriod=30 Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.261065 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-api" containerID="cri-o://64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c" gracePeriod=30 Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269684 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") 
" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269835 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269892 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269909 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269952 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.269981 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rf6r\" (UniqueName: \"kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r\") pod \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\" (UID: \"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2\") " Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.271041 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.0.209:8774/\": EOF" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.275243 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": EOF" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.281393 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r" (OuterVolumeSpecName: "kube-api-access-2rf6r") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "kube-api-access-2rf6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.314879 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.315111 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-log" containerID="cri-o://5e059c39fc6e01fedf0eec75b279fe94bc6a1def69beae0d6b999943344c1583" gracePeriod=30 Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.315570 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-metadata" containerID="cri-o://371c0f21c89a332692b7c6d4c80843f0d911011c478e7cce01bbb21a138cf6f2" gracePeriod=30 Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.377307 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: 
"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.378788 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.378816 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rf6r\" (UniqueName: \"kubernetes.io/projected/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-kube-api-access-2rf6r\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.424258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.431711 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.460912 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.480619 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.480673 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.480686 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.497908 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config" (OuterVolumeSpecName: "config") pod "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" (UID: "4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.582460 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:14 crc kubenswrapper[4797]: I0930 18:04:14.866187 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.144092 4797 generic.go:334] "Generic (PLEG): container finished" podID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerID="371c0f21c89a332692b7c6d4c80843f0d911011c478e7cce01bbb21a138cf6f2" exitCode=0 Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.144122 4797 generic.go:334] "Generic (PLEG): container finished" podID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerID="5e059c39fc6e01fedf0eec75b279fe94bc6a1def69beae0d6b999943344c1583" exitCode=143 Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.144164 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerDied","Data":"371c0f21c89a332692b7c6d4c80843f0d911011c478e7cce01bbb21a138cf6f2"} Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.144187 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerDied","Data":"5e059c39fc6e01fedf0eec75b279fe94bc6a1def69beae0d6b999943344c1583"} Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.148111 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" event={"ID":"4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2","Type":"ContainerDied","Data":"459f0e5663e3f327c89e348a0c4b44d969d477db0ceed66225dbfcb3d768f577"} Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.148142 4797 scope.go:117] "RemoveContainer" 
containerID="ee2aa224c3f2eacb283380e32219de391b77d94d5a09e4cf0deee6f7cc10fafa" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.148254 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-msj2h" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.154934 4797 generic.go:334] "Generic (PLEG): container finished" podID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerID="f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921" exitCode=143 Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.155702 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerDied","Data":"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921"} Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.187858 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"] Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.190626 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.193856 4797 scope.go:117] "RemoveContainer" containerID="09c896db5122ba4e8b7d0f86c00eeeb562dc328690f35b63b8d79a09ea36c3ba" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.198387 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-msj2h"] Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.307179 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs\") pod \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.307258 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data\") pod \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.307331 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle\") pod \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.307458 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pph6s\" (UniqueName: \"kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s\") pod \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.307646 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs\") pod \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\" (UID: \"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9\") " Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.311035 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs" (OuterVolumeSpecName: "logs") pod "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" (UID: "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.314424 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s" (OuterVolumeSpecName: "kube-api-access-pph6s") pod "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" (UID: "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9"). InnerVolumeSpecName "kube-api-access-pph6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.335594 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" (UID: "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.347542 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data" (OuterVolumeSpecName: "config-data") pod "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" (UID: "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.379423 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" (UID: "e3b9d562-1acf-4ac3-817c-4eb4e51b97f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.411879 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.411908 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.411920 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.411929 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:15 crc kubenswrapper[4797]: I0930 18:04:15.411938 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pph6s\" (UniqueName: \"kubernetes.io/projected/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9-kube-api-access-pph6s\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.188814 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"e3b9d562-1acf-4ac3-817c-4eb4e51b97f9","Type":"ContainerDied","Data":"dc782ba11f507b10063e92a0e949a12302cb416963244bed7adeb90741f8f736"} Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.189295 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.189237 4797 scope.go:117] "RemoveContainer" containerID="371c0f21c89a332692b7c6d4c80843f0d911011c478e7cce01bbb21a138cf6f2" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.204712 4797 generic.go:334] "Generic (PLEG): container finished" podID="ab000491-3207-401f-bd75-033e8b569622" containerID="36664ee1695bbc4f1051bc55f5404f12faafc9e456f39e5f5dbb71b3240d0162" exitCode=0 Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.204812 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mltht" event={"ID":"ab000491-3207-401f-bd75-033e8b569622","Type":"ContainerDied","Data":"36664ee1695bbc4f1051bc55f5404f12faafc9e456f39e5f5dbb71b3240d0162"} Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.205251 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerName="nova-scheduler-scheduler" containerID="cri-o://d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" gracePeriod=30 Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.223895 4797 scope.go:117] "RemoveContainer" containerID="5e059c39fc6e01fedf0eec75b279fe94bc6a1def69beae0d6b999943344c1583" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.279926 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" path="/var/lib/kubelet/pods/4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2/volumes" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.282781 4797 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.282816 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.288524 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:16 crc kubenswrapper[4797]: E0930 18:04:16.288998 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" containerName="nova-manage" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289020 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" containerName="nova-manage" Sep 30 18:04:16 crc kubenswrapper[4797]: E0930 18:04:16.289037 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="init" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289046 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="init" Sep 30 18:04:16 crc kubenswrapper[4797]: E0930 18:04:16.289078 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-log" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289087 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-log" Sep 30 18:04:16 crc kubenswrapper[4797]: E0930 18:04:16.289102 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="dnsmasq-dns" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289110 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="dnsmasq-dns" Sep 30 18:04:16 crc kubenswrapper[4797]: E0930 18:04:16.289136 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-metadata" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289144 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-metadata" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289392 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" containerName="nova-manage" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289422 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b30dbf3-9b15-4629-b25a-e9b9a6b0b7d2" containerName="dnsmasq-dns" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289490 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-log" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.289506 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" containerName="nova-metadata-metadata" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.292011 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.295472 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.295696 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.300736 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.437398 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.437531 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.437867 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.437957 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data\") pod \"nova-metadata-0\" (UID: 
\"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.437991 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwq89\" (UniqueName: \"kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.540093 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.540902 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.541029 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.541131 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.541165 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwq89\" (UniqueName: \"kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.541690 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.545207 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.551652 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.556412 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.559045 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwq89\" (UniqueName: \"kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89\") pod 
\"nova-metadata-0\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") " pod="openstack/nova-metadata-0" Sep 30 18:04:16 crc kubenswrapper[4797]: I0930 18:04:16.633653 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.084493 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.211524 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.225520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerStarted","Data":"04042c4284c1bb3b27549d2efb41c3e0339d9bdea98ee1f711a5607197cfec57"} Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.227053 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mltht" event={"ID":"ab000491-3207-401f-bd75-033e8b569622","Type":"ContainerDied","Data":"19cc452614139c3ba184f9330e65d33b9a95c5bd62762c86268688bc7351aad3"} Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.227079 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19cc452614139c3ba184f9330e65d33b9a95c5bd62762c86268688bc7351aad3" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.227106 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mltht" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.248698 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b9d562-1acf-4ac3-817c-4eb4e51b97f9" path="/var/lib/kubelet/pods/e3b9d562-1acf-4ac3-817c-4eb4e51b97f9/volumes" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.376981 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 18:04:18 crc kubenswrapper[4797]: E0930 18:04:18.377507 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab000491-3207-401f-bd75-033e8b569622" containerName="nova-cell1-conductor-db-sync" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.377528 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab000491-3207-401f-bd75-033e8b569622" containerName="nova-cell1-conductor-db-sync" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.377823 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab000491-3207-401f-bd75-033e8b569622" containerName="nova-cell1-conductor-db-sync" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.384400 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts\") pod \"ab000491-3207-401f-bd75-033e8b569622\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.386416 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz8pw\" (UniqueName: \"kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw\") pod \"ab000491-3207-401f-bd75-033e8b569622\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.386744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle\") pod \"ab000491-3207-401f-bd75-033e8b569622\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.386925 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data\") pod \"ab000491-3207-401f-bd75-033e8b569622\" (UID: \"ab000491-3207-401f-bd75-033e8b569622\") " Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.387402 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.389472 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts" (OuterVolumeSpecName: "scripts") pod "ab000491-3207-401f-bd75-033e8b569622" (UID: "ab000491-3207-401f-bd75-033e8b569622"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.401382 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 18:04:18 crc kubenswrapper[4797]: E0930 18:04:18.405240 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.406077 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw" (OuterVolumeSpecName: "kube-api-access-lz8pw") pod "ab000491-3207-401f-bd75-033e8b569622" (UID: "ab000491-3207-401f-bd75-033e8b569622"). InnerVolumeSpecName "kube-api-access-lz8pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:18 crc kubenswrapper[4797]: E0930 18:04:18.433593 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 18:04:18 crc kubenswrapper[4797]: E0930 18:04:18.449844 4797 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 18:04:18 crc kubenswrapper[4797]: E0930 18:04:18.449915 4797 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerName="nova-scheduler-scheduler" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.450352 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data" (OuterVolumeSpecName: "config-data") pod "ab000491-3207-401f-bd75-033e8b569622" (UID: "ab000491-3207-401f-bd75-033e8b569622"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.475639 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab000491-3207-401f-bd75-033e8b569622" (UID: "ab000491-3207-401f-bd75-033e8b569622"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.489777 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490003 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490115 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2d5\" (UniqueName: \"kubernetes.io/projected/549d2ac6-4a20-4698-ad97-b4a94dab16e0-kube-api-access-jc2d5\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490377 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490534 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490606 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ab000491-3207-401f-bd75-033e8b569622-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.490654 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz8pw\" (UniqueName: \"kubernetes.io/projected/ab000491-3207-401f-bd75-033e8b569622-kube-api-access-lz8pw\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.592771 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.593040 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.593065 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2d5\" (UniqueName: \"kubernetes.io/projected/549d2ac6-4a20-4698-ad97-b4a94dab16e0-kube-api-access-jc2d5\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.596205 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.596287 4797 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549d2ac6-4a20-4698-ad97-b4a94dab16e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.608176 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2d5\" (UniqueName: \"kubernetes.io/projected/549d2ac6-4a20-4698-ad97-b4a94dab16e0-kube-api-access-jc2d5\") pod \"nova-cell1-conductor-0\" (UID: \"549d2ac6-4a20-4698-ad97-b4a94dab16e0\") " pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:18 crc kubenswrapper[4797]: I0930 18:04:18.798422 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.147655 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.237988 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerStarted","Data":"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"} Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.238023 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerStarted","Data":"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"} Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.256931 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.256911315 podStartE2EDuration="3.256911315s" podCreationTimestamp="2025-09-30 18:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 18:04:19.254709385 +0000 UTC m=+1309.777208643" watchObservedRunningTime="2025-09-30 18:04:19.256911315 +0000 UTC m=+1309.779410573" Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.306412 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 18:04:19 crc kubenswrapper[4797]: I0930 18:04:19.944651 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.027496 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrnvz\" (UniqueName: \"kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz\") pod \"bf6f89ef-650a-461e-8a62-fb023d066be8\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.027642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data\") pod \"bf6f89ef-650a-461e-8a62-fb023d066be8\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.027715 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle\") pod \"bf6f89ef-650a-461e-8a62-fb023d066be8\" (UID: \"bf6f89ef-650a-461e-8a62-fb023d066be8\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.045991 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz" (OuterVolumeSpecName: "kube-api-access-jrnvz") pod "bf6f89ef-650a-461e-8a62-fb023d066be8" (UID: "bf6f89ef-650a-461e-8a62-fb023d066be8"). InnerVolumeSpecName "kube-api-access-jrnvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.065816 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data" (OuterVolumeSpecName: "config-data") pod "bf6f89ef-650a-461e-8a62-fb023d066be8" (UID: "bf6f89ef-650a-461e-8a62-fb023d066be8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.069805 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf6f89ef-650a-461e-8a62-fb023d066be8" (UID: "bf6f89ef-650a-461e-8a62-fb023d066be8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.139221 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.139258 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6f89ef-650a-461e-8a62-fb023d066be8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.139275 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrnvz\" (UniqueName: \"kubernetes.io/projected/bf6f89ef-650a-461e-8a62-fb023d066be8-kube-api-access-jrnvz\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.188393 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.253591 4797 generic.go:334] "Generic (PLEG): container finished" podID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" exitCode=0 Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.253713 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.254402 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf6f89ef-650a-461e-8a62-fb023d066be8","Type":"ContainerDied","Data":"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.254467 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf6f89ef-650a-461e-8a62-fb023d066be8","Type":"ContainerDied","Data":"6f41e41a38f90114e45e6dfa5fb25508a3441983f5722f9832d16d383e377fba"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.254489 4797 scope.go:117] "RemoveContainer" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.257084 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"549d2ac6-4a20-4698-ad97-b4a94dab16e0","Type":"ContainerStarted","Data":"95218c23fe0cf8f70bbe0bd9ceb99c377a71fc846e651ed0c233164359daaeb3"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.257119 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"549d2ac6-4a20-4698-ad97-b4a94dab16e0","Type":"ContainerStarted","Data":"fe2416bbb1d32468f667b9c1cff78f312f8ba691edba6043aa514e39c8c1e368"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.257337 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.260047 4797 generic.go:334] "Generic (PLEG): container finished" podID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerID="64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c" exitCode=0 Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.260155 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerDied","Data":"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.260186 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4510cc0b-2614-4f66-8d12-e02dc78eec7a","Type":"ContainerDied","Data":"40604753d9b3db2f319eb7375bba1b5eeef3482e0cf0d1cac96502166623c95a"} Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.260304 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.290374 4797 scope.go:117] "RemoveContainer" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.290898 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0\": container with ID starting with d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0 not found: ID does not exist" containerID="d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.290950 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0"} err="failed to get container status \"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0\": rpc error: code = NotFound desc = could not find container \"d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0\": container with ID starting with d3c49ecd91af8d23344882726bfb3931854b4034a12752b124e85d04cfd34ee0 not found: ID does not exist" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.290982 4797 scope.go:117] "RemoveContainer" containerID="64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.303884 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.303853261 podStartE2EDuration="2.303853261s" podCreationTimestamp="2025-09-30 18:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:20.301406724 +0000 UTC m=+1310.823905972" watchObservedRunningTime="2025-09-30 
18:04:20.303853261 +0000 UTC m=+1310.826352519" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.313040 4797 scope.go:117] "RemoveContainer" containerID="f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.325610 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.336448 4797 scope.go:117] "RemoveContainer" containerID="64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c" Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.336895 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c\": container with ID starting with 64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c not found: ID does not exist" containerID="64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.336946 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c"} err="failed to get container status \"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c\": rpc error: code = NotFound desc = could not find container \"64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c\": container with ID starting with 64f6a4ee4289716847806d93b3b9546813b6ae4ad13fce19b890d01c13aa0b9c not found: ID does not exist" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.336981 4797 scope.go:117] "RemoveContainer" containerID="f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921" Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.337644 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921\": container with ID starting with f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921 not found: ID does not exist" containerID="f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.337675 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921"} err="failed to get container status \"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921\": rpc error: code = NotFound desc = could not find container \"f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921\": container with ID starting with f12bc686d16e29a398d797c7d93eb650629a1119f27a57ac70860eec77875921 not found: ID does not exist" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.341159 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.341814 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs\") pod \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.341926 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94cbp\" (UniqueName: \"kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp\") pod \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.341976 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle\") pod 
\"4510cc0b-2614-4f66-8d12-e02dc78eec7a\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.342111 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data\") pod \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\" (UID: \"4510cc0b-2614-4f66-8d12-e02dc78eec7a\") " Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.345908 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs" (OuterVolumeSpecName: "logs") pod "4510cc0b-2614-4f66-8d12-e02dc78eec7a" (UID: "4510cc0b-2614-4f66-8d12-e02dc78eec7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.349714 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp" (OuterVolumeSpecName: "kube-api-access-94cbp") pod "4510cc0b-2614-4f66-8d12-e02dc78eec7a" (UID: "4510cc0b-2614-4f66-8d12-e02dc78eec7a"). InnerVolumeSpecName "kube-api-access-94cbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.352711 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.353355 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-api" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353382 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-api" Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.353416 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerName="nova-scheduler-scheduler" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353424 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerName="nova-scheduler-scheduler" Sep 30 18:04:20 crc kubenswrapper[4797]: E0930 18:04:20.353476 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-log" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353486 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-log" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353716 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" containerName="nova-scheduler-scheduler" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353754 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" containerName="nova-api-log" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.353772 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" 
containerName="nova-api-api" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.354578 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.356820 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.361456 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.402538 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4510cc0b-2614-4f66-8d12-e02dc78eec7a" (UID: "4510cc0b-2614-4f66-8d12-e02dc78eec7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.413606 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data" (OuterVolumeSpecName: "config-data") pod "4510cc0b-2614-4f66-8d12-e02dc78eec7a" (UID: "4510cc0b-2614-4f66-8d12-e02dc78eec7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.444791 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.444910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.444948 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2zh\" (UniqueName: \"kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.445249 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.445292 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4510cc0b-2614-4f66-8d12-e02dc78eec7a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.445302 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94cbp\" (UniqueName: \"kubernetes.io/projected/4510cc0b-2614-4f66-8d12-e02dc78eec7a-kube-api-access-94cbp\") on node \"crc\" DevicePath 
\"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.445316 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4510cc0b-2614-4f66-8d12-e02dc78eec7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.547245 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2zh\" (UniqueName: \"kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.547384 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.547485 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.551888 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.552253 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data\") pod 
\"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.565089 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2zh\" (UniqueName: \"kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh\") pod \"nova-scheduler-0\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") " pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.616198 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.626617 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.637656 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.639225 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.641812 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.655236 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.750063 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcb6x\" (UniqueName: \"kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.750352 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.750401 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.750448 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.759510 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.852228 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.852290 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.852321 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.852355 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb6x\" (UniqueName: \"kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.855682 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.860198 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.861011 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:20 crc kubenswrapper[4797]: I0930 18:04:20.876422 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcb6x\" (UniqueName: \"kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x\") pod \"nova-api-0\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " pod="openstack/nova-api-0" Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.011596 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.222594 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.283843 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"733c5145-7413-4b60-9e71-2c96a13cb8b9","Type":"ContainerStarted","Data":"ea02bacceb30db2ef0d8ed6d7c3c7128baf69420071d1352c48d8da953cb29db"} Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.503312 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.633791 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 18:04:21 crc kubenswrapper[4797]: I0930 18:04:21.635491 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 18:04:22 crc 
kubenswrapper[4797]: I0930 18:04:22.248261 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4510cc0b-2614-4f66-8d12-e02dc78eec7a" path="/var/lib/kubelet/pods/4510cc0b-2614-4f66-8d12-e02dc78eec7a/volumes" Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.249179 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6f89ef-650a-461e-8a62-fb023d066be8" path="/var/lib/kubelet/pods/bf6f89ef-650a-461e-8a62-fb023d066be8/volumes" Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.299472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerStarted","Data":"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c"} Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.299535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerStarted","Data":"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173"} Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.299553 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerStarted","Data":"90f46afc4095da79b62dae96bfa880fc3c49859d9112c11e5371e704dbbdfebd"} Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.304346 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"733c5145-7413-4b60-9e71-2c96a13cb8b9","Type":"ContainerStarted","Data":"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"} Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.326272 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.326244413 podStartE2EDuration="2.326244413s" podCreationTimestamp="2025-09-30 18:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:22.321801092 +0000 UTC m=+1312.844300330" watchObservedRunningTime="2025-09-30 18:04:22.326244413 +0000 UTC m=+1312.848743651" Sep 30 18:04:22 crc kubenswrapper[4797]: I0930 18:04:22.343309 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.343282988 podStartE2EDuration="2.343282988s" podCreationTimestamp="2025-09-30 18:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:22.34112538 +0000 UTC m=+1312.863624618" watchObservedRunningTime="2025-09-30 18:04:22.343282988 +0000 UTC m=+1312.865782226" Sep 30 18:04:23 crc kubenswrapper[4797]: I0930 18:04:23.443260 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:23 crc kubenswrapper[4797]: I0930 18:04:23.443803 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerName="kube-state-metrics" containerID="cri-o://e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078" gracePeriod=30 Sep 30 18:04:23 crc kubenswrapper[4797]: I0930 18:04:23.954125 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.024954 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvqvn\" (UniqueName: \"kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn\") pod \"6270d3ec-66cb-42c0-97d5-a83f4fe7c854\" (UID: \"6270d3ec-66cb-42c0-97d5-a83f4fe7c854\") " Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.031678 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn" (OuterVolumeSpecName: "kube-api-access-pvqvn") pod "6270d3ec-66cb-42c0-97d5-a83f4fe7c854" (UID: "6270d3ec-66cb-42c0-97d5-a83f4fe7c854"). InnerVolumeSpecName "kube-api-access-pvqvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.127662 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvqvn\" (UniqueName: \"kubernetes.io/projected/6270d3ec-66cb-42c0-97d5-a83f4fe7c854-kube-api-access-pvqvn\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.328026 4797 generic.go:334] "Generic (PLEG): container finished" podID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerID="e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078" exitCode=2 Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.328072 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.328076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6270d3ec-66cb-42c0-97d5-a83f4fe7c854","Type":"ContainerDied","Data":"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078"} Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.328931 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6270d3ec-66cb-42c0-97d5-a83f4fe7c854","Type":"ContainerDied","Data":"ab53a8a6e20e8b1a477125c63163c59d13702b6df47a8c825c3e8c83349cca29"} Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.328960 4797 scope.go:117] "RemoveContainer" containerID="e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.368296 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.383582 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.394687 4797 scope.go:117] "RemoveContainer" containerID="e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078" Sep 30 18:04:24 crc kubenswrapper[4797]: E0930 18:04:24.395214 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078\": container with ID starting with e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078 not found: ID does not exist" containerID="e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.395253 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078"} err="failed to get container status \"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078\": rpc error: code = NotFound desc = could not find container \"e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078\": container with ID starting with e3899cd540db0b096c6c02a35a4b037d2ea07a0f88a5bc8553f2faada6486078 not found: ID does not exist" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.395296 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:24 crc kubenswrapper[4797]: E0930 18:04:24.413714 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerName="kube-state-metrics" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.413764 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerName="kube-state-metrics" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.414578 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerName="kube-state-metrics" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.415787 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.419668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.419934 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.424709 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.543195 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.543250 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ks97\" (UniqueName: \"kubernetes.io/projected/1084da2e-fefb-4741-89c4-90257f878bf8-kube-api-access-9ks97\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.543575 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.543801 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.645949 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.646052 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.646077 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ks97\" (UniqueName: \"kubernetes.io/projected/1084da2e-fefb-4741-89c4-90257f878bf8-kube-api-access-9ks97\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.646161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.651639 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.652596 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.660333 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1084da2e-fefb-4741-89c4-90257f878bf8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.667942 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ks97\" (UniqueName: \"kubernetes.io/projected/1084da2e-fefb-4741-89c4-90257f878bf8-kube-api-access-9ks97\") pod \"kube-state-metrics-0\" (UID: \"1084da2e-fefb-4741-89c4-90257f878bf8\") " pod="openstack/kube-state-metrics-0" Sep 30 18:04:24 crc kubenswrapper[4797]: I0930 18:04:24.750536 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.232218 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.310309 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.310773 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-central-agent" containerID="cri-o://26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878" gracePeriod=30 Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.310894 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-notification-agent" containerID="cri-o://ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390" gracePeriod=30 Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.310874 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="sg-core" containerID="cri-o://a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373" gracePeriod=30 Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.310830 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="proxy-httpd" containerID="cri-o://bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb" gracePeriod=30 Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.341482 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"1084da2e-fefb-4741-89c4-90257f878bf8","Type":"ContainerStarted","Data":"ce1d08228380e0fade5f37328091d24138cc585281c38da22ce5816681d44d50"} Sep 30 18:04:25 crc kubenswrapper[4797]: I0930 18:04:25.760523 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.255155 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" path="/var/lib/kubelet/pods/6270d3ec-66cb-42c0-97d5-a83f4fe7c854/volumes" Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.367873 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1084da2e-fefb-4741-89c4-90257f878bf8","Type":"ContainerStarted","Data":"cd8f437dbfb15cfb717f4fcc2d6fcadaa31d55eae6245d53ef07b7ee50fdf0d1"} Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.368077 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373129 4797 generic.go:334] "Generic (PLEG): container finished" podID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerID="bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb" exitCode=0 Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373157 4797 generic.go:334] "Generic (PLEG): container finished" podID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerID="a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373" exitCode=2 Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373168 4797 generic.go:334] "Generic (PLEG): container finished" podID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerID="26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878" exitCode=0 Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373205 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerDied","Data":"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb"} Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373235 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerDied","Data":"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373"} Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.373246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerDied","Data":"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878"} Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.634632 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 18:04:26 crc kubenswrapper[4797]: I0930 18:04:26.634708 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 18:04:27 crc kubenswrapper[4797]: I0930 18:04:27.646951 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:04:27 crc kubenswrapper[4797]: I0930 18:04:27.646982 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:04:27 crc kubenswrapper[4797]: I0930 18:04:27.990339 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.016746 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.648774702 podStartE2EDuration="4.016724756s" podCreationTimestamp="2025-09-30 18:04:24 +0000 UTC" firstStartedPulling="2025-09-30 18:04:25.228741537 +0000 UTC m=+1315.751240815" lastFinishedPulling="2025-09-30 18:04:25.596691641 +0000 UTC m=+1316.119190869" observedRunningTime="2025-09-30 18:04:26.400950083 +0000 UTC m=+1316.923449321" watchObservedRunningTime="2025-09-30 18:04:28.016724756 +0000 UTC m=+1318.539223994" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119216 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq8lc\" (UniqueName: \"kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119287 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119367 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119420 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle\") pod 
\"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119481 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119540 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.119581 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts\") pod \"72b870b2-8f3b-4d26-9c77-272986d31c97\" (UID: \"72b870b2-8f3b-4d26-9c77-272986d31c97\") " Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.120774 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.120840 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.126663 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc" (OuterVolumeSpecName: "kube-api-access-bq8lc") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "kube-api-access-bq8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.144579 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts" (OuterVolumeSpecName: "scripts") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.160526 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.208094 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224183 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq8lc\" (UniqueName: \"kubernetes.io/projected/72b870b2-8f3b-4d26-9c77-272986d31c97-kube-api-access-bq8lc\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224223 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224237 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224247 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72b870b2-8f3b-4d26-9c77-272986d31c97-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224260 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.224270 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.250894 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data" (OuterVolumeSpecName: "config-data") pod "72b870b2-8f3b-4d26-9c77-272986d31c97" (UID: "72b870b2-8f3b-4d26-9c77-272986d31c97"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.326396 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b870b2-8f3b-4d26-9c77-272986d31c97-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.400374 4797 generic.go:334] "Generic (PLEG): container finished" podID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerID="ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390" exitCode=0 Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.400456 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.400506 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerDied","Data":"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390"} Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.400861 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72b870b2-8f3b-4d26-9c77-272986d31c97","Type":"ContainerDied","Data":"2d5976bd4237764207bc3b6b2c8032d34f84bd5feb417cd97523c7438fbd8887"} Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.400937 4797 scope.go:117] "RemoveContainer" containerID="bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.429763 4797 scope.go:117] "RemoveContainer" containerID="a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.434102 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.483013 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.491929 4797 scope.go:117] "RemoveContainer" containerID="ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.500118 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.501112 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-notification-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.501188 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-notification-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.501267 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-central-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.501316 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-central-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.501369 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="proxy-httpd" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.501414 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="proxy-httpd" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.501527 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="sg-core" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.501587 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="sg-core" Sep 30 18:04:28 crc 
kubenswrapper[4797]: I0930 18:04:28.501842 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-notification-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.501975 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="ceilometer-central-agent" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.502043 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="sg-core" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.502094 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" containerName="proxy-httpd" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.504893 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.507031 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.507677 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.508145 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.535590 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.539407 4797 scope.go:117] "RemoveContainer" containerID="26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.600680 4797 scope.go:117] "RemoveContainer" containerID="bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb" Sep 30 18:04:28 crc 
kubenswrapper[4797]: E0930 18:04:28.602805 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb\": container with ID starting with bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb not found: ID does not exist" containerID="bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.602845 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb"} err="failed to get container status \"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb\": rpc error: code = NotFound desc = could not find container \"bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb\": container with ID starting with bae0f980fdff23c16458845d2413975af507f1f6437614408baf76a45715ddfb not found: ID does not exist" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.602870 4797 scope.go:117] "RemoveContainer" containerID="a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.603170 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373\": container with ID starting with a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373 not found: ID does not exist" containerID="a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.603211 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373"} err="failed to get container status 
\"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373\": rpc error: code = NotFound desc = could not find container \"a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373\": container with ID starting with a4b68720b0caf4a4bf164b8efcbf5dddc44c5de341c9e6143377857a5165a373 not found: ID does not exist" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.603236 4797 scope.go:117] "RemoveContainer" containerID="ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.603705 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390\": container with ID starting with ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390 not found: ID does not exist" containerID="ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.603728 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390"} err="failed to get container status \"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390\": rpc error: code = NotFound desc = could not find container \"ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390\": container with ID starting with ae21598f4428179183f3de0266ae84e6bc755e9a9381e799340048f1f22ff390 not found: ID does not exist" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.603742 4797 scope.go:117] "RemoveContainer" containerID="26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878" Sep 30 18:04:28 crc kubenswrapper[4797]: E0930 18:04:28.603976 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878\": container with ID starting with 26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878 not found: ID does not exist" containerID="26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.604001 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878"} err="failed to get container status \"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878\": rpc error: code = NotFound desc = could not find container \"26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878\": container with ID starting with 26d1d0645c438566574579c7189d83a12c9ea6f6fc55ca096d73ff4be81dd878 not found: ID does not exist" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634095 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgpk\" (UniqueName: \"kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634141 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634161 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 
18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634261 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634291 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634310 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634330 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.634531 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736524 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xfgpk\" (UniqueName: \"kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736785 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736804 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736875 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736908 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736931 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " 
pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736960 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.736978 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.737489 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.737515 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.740752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.741420 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.741676 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.744059 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.756301 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.759328 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgpk\" (UniqueName: \"kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk\") pod \"ceilometer-0\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " pod="openstack/ceilometer-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.817880 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="6270d3ec-66cb-42c0-97d5-a83f4fe7c854" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:04:28 crc 
kubenswrapper[4797]: I0930 18:04:28.846090 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 18:04:28 crc kubenswrapper[4797]: I0930 18:04:28.893854 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:29 crc kubenswrapper[4797]: I0930 18:04:29.367670 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:29 crc kubenswrapper[4797]: I0930 18:04:29.423325 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerStarted","Data":"1ec2ba3cd6b6ee7b792fcc0e317d6fb2b1b8c0ecc77fc94ed5e005a3f75df06b"} Sep 30 18:04:30 crc kubenswrapper[4797]: I0930 18:04:30.253978 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b870b2-8f3b-4d26-9c77-272986d31c97" path="/var/lib/kubelet/pods/72b870b2-8f3b-4d26-9c77-272986d31c97/volumes" Sep 30 18:04:30 crc kubenswrapper[4797]: I0930 18:04:30.440415 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerStarted","Data":"8deeb4078b595a04d370deff8babe506af3234ac17b94a6acf8f5af4a2c770c8"} Sep 30 18:04:30 crc kubenswrapper[4797]: I0930 18:04:30.760666 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 18:04:30 crc kubenswrapper[4797]: I0930 18:04:30.808463 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 18:04:31 crc kubenswrapper[4797]: I0930 18:04:31.012185 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 18:04:31 crc kubenswrapper[4797]: I0930 18:04:31.014007 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 
30 18:04:31 crc kubenswrapper[4797]: I0930 18:04:31.457737 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerStarted","Data":"081189dd0e3c0c1e205378345b9128f7a752a1e57d87819adcaa62546092b3d7"} Sep 30 18:04:31 crc kubenswrapper[4797]: I0930 18:04:31.499327 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 18:04:32 crc kubenswrapper[4797]: I0930 18:04:32.094760 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.218:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:04:32 crc kubenswrapper[4797]: I0930 18:04:32.094821 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.218:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:04:33 crc kubenswrapper[4797]: I0930 18:04:33.497375 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerStarted","Data":"09ce2456ae699edb4cfd1e1689832e775f1795d95fea1447dcc17a3e29e86516"} Sep 30 18:04:34 crc kubenswrapper[4797]: I0930 18:04:34.509883 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerStarted","Data":"25a591a7d2082190655a447bc0cbf2d15553414776ce37e051efcc30426c502d"} Sep 30 18:04:34 crc kubenswrapper[4797]: I0930 18:04:34.510422 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:04:34 crc kubenswrapper[4797]: I0930 
18:04:34.767756 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 18:04:34 crc kubenswrapper[4797]: I0930 18:04:34.789119 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9601584810000001 podStartE2EDuration="6.789096208s" podCreationTimestamp="2025-09-30 18:04:28 +0000 UTC" firstStartedPulling="2025-09-30 18:04:29.383643995 +0000 UTC m=+1319.906143233" lastFinishedPulling="2025-09-30 18:04:34.212581722 +0000 UTC m=+1324.735080960" observedRunningTime="2025-09-30 18:04:34.536265287 +0000 UTC m=+1325.058764515" watchObservedRunningTime="2025-09-30 18:04:34.789096208 +0000 UTC m=+1325.311595446" Sep 30 18:04:36 crc kubenswrapper[4797]: I0930 18:04:36.641499 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 18:04:36 crc kubenswrapper[4797]: I0930 18:04:36.643518 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 18:04:36 crc kubenswrapper[4797]: I0930 18:04:36.648550 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 18:04:37 crc kubenswrapper[4797]: I0930 18:04:37.666916 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.532172 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.581024 4797 generic.go:334] "Generic (PLEG): container finished" podID="93447e34-8d1f-42de-8345-224b4154ad10" containerID="2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7" exitCode=137 Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.581091 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.581123 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93447e34-8d1f-42de-8345-224b4154ad10","Type":"ContainerDied","Data":"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7"} Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.581162 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93447e34-8d1f-42de-8345-224b4154ad10","Type":"ContainerDied","Data":"2a30613434079fe3a0c28e7f7b4a0cd6b6b8c8392fcaa3ed5d1ad677c9a4e66e"} Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.581186 4797 scope.go:117] "RemoveContainer" containerID="2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.607738 4797 scope.go:117] "RemoveContainer" containerID="2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7" Sep 30 18:04:38 crc kubenswrapper[4797]: E0930 18:04:38.608176 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7\": container with ID starting with 2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7 not found: ID does not exist" containerID="2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.608225 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7"} err="failed to get container status \"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7\": rpc error: code = NotFound desc = could not find container \"2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7\": container with ID starting with 
2123f08d6bb658f9dc2e74fd099baa4d2fdffc8b5b376b7620790363dfeea5c7 not found: ID does not exist" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.671534 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data\") pod \"93447e34-8d1f-42de-8345-224b4154ad10\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.671645 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvngx\" (UniqueName: \"kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx\") pod \"93447e34-8d1f-42de-8345-224b4154ad10\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.671698 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle\") pod \"93447e34-8d1f-42de-8345-224b4154ad10\" (UID: \"93447e34-8d1f-42de-8345-224b4154ad10\") " Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.695075 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx" (OuterVolumeSpecName: "kube-api-access-jvngx") pod "93447e34-8d1f-42de-8345-224b4154ad10" (UID: "93447e34-8d1f-42de-8345-224b4154ad10"). InnerVolumeSpecName "kube-api-access-jvngx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.704086 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data" (OuterVolumeSpecName: "config-data") pod "93447e34-8d1f-42de-8345-224b4154ad10" (UID: "93447e34-8d1f-42de-8345-224b4154ad10"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.724064 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93447e34-8d1f-42de-8345-224b4154ad10" (UID: "93447e34-8d1f-42de-8345-224b4154ad10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.776114 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.776139 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvngx\" (UniqueName: \"kubernetes.io/projected/93447e34-8d1f-42de-8345-224b4154ad10-kube-api-access-jvngx\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.776151 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93447e34-8d1f-42de-8345-224b4154ad10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.927306 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.938982 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.950797 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:38 crc kubenswrapper[4797]: E0930 18:04:38.951310 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93447e34-8d1f-42de-8345-224b4154ad10" 
containerName="nova-cell1-novncproxy-novncproxy" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.951329 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="93447e34-8d1f-42de-8345-224b4154ad10" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.951586 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="93447e34-8d1f-42de-8345-224b4154ad10" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.952411 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.957464 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.959783 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.960610 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.984555 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.984710 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksrw8\" (UniqueName: \"kubernetes.io/projected/e16d4537-8c99-431f-bcd6-da24200f085b-kube-api-access-ksrw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.984771 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.984804 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:38 crc kubenswrapper[4797]: I0930 18:04:38.984910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.000785 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.086451 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.086508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.086528 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.086653 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.086710 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksrw8\" (UniqueName: \"kubernetes.io/projected/e16d4537-8c99-431f-bcd6-da24200f085b-kube-api-access-ksrw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.090645 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.091106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.092388 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.092451 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16d4537-8c99-431f-bcd6-da24200f085b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.104860 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksrw8\" (UniqueName: \"kubernetes.io/projected/e16d4537-8c99-431f-bcd6-da24200f085b-kube-api-access-ksrw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16d4537-8c99-431f-bcd6-da24200f085b\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.288512 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:39 crc kubenswrapper[4797]: I0930 18:04:39.775685 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 18:04:40 crc kubenswrapper[4797]: I0930 18:04:40.251069 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93447e34-8d1f-42de-8345-224b4154ad10" path="/var/lib/kubelet/pods/93447e34-8d1f-42de-8345-224b4154ad10/volumes" Sep 30 18:04:40 crc kubenswrapper[4797]: I0930 18:04:40.604619 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16d4537-8c99-431f-bcd6-da24200f085b","Type":"ContainerStarted","Data":"8a04f3fe7657711b1c686aa58f107ee3a86a2c6cb04fdf9a2069e59abe47ebd5"} Sep 30 18:04:40 crc kubenswrapper[4797]: I0930 18:04:40.604663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16d4537-8c99-431f-bcd6-da24200f085b","Type":"ContainerStarted","Data":"9afa4ee9e10f226fe765034a19e84eba9485588dd834766489373264a3dee7c3"} Sep 30 18:04:40 crc kubenswrapper[4797]: I0930 18:04:40.629777 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.62975164 podStartE2EDuration="2.62975164s" podCreationTimestamp="2025-09-30 18:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:40.62721549 +0000 UTC m=+1331.149714748" watchObservedRunningTime="2025-09-30 18:04:40.62975164 +0000 UTC m=+1331.152250898" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.022199 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.023245 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 18:04:41 crc 
kubenswrapper[4797]: I0930 18:04:41.023539 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.028130 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.615527 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.769175 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.923039 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"] Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.925156 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.944259 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"] Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.951006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.951062 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 
18:04:41.951283 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.951518 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.951603 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:41 crc kubenswrapper[4797]: I0930 18:04:41.951632 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjd99\" (UniqueName: \"kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.053292 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc 
kubenswrapper[4797]: I0930 18:04:42.054074 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjd99\" (UniqueName: \"kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.054170 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.054261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.054338 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.054920 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.054139 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.055943 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.056153 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.056207 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.057022 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.082645 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjd99\" (UniqueName: 
\"kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99\") pod \"dnsmasq-dns-59cf4bdb65-vqf8r\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.250867 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:42 crc kubenswrapper[4797]: I0930 18:04:42.699167 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"] Sep 30 18:04:42 crc kubenswrapper[4797]: W0930 18:04:42.705476 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1494eb69_c2b0_489c_a348_dfa4047c09db.slice/crio-4b235b6f5c96ea7558a10d76aabf48b21899ee5eb4dca4cff686fd25a6c6ae34 WatchSource:0}: Error finding container 4b235b6f5c96ea7558a10d76aabf48b21899ee5eb4dca4cff686fd25a6c6ae34: Status 404 returned error can't find the container with id 4b235b6f5c96ea7558a10d76aabf48b21899ee5eb4dca4cff686fd25a6c6ae34 Sep 30 18:04:43 crc kubenswrapper[4797]: I0930 18:04:43.632724 4797 generic.go:334] "Generic (PLEG): container finished" podID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerID="fcb3b81368cfdd97e74b410c964dbfbdbb87d0936a08eeb9ff34372e922835bf" exitCode=0 Sep 30 18:04:43 crc kubenswrapper[4797]: I0930 18:04:43.632827 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" event={"ID":"1494eb69-c2b0-489c-a348-dfa4047c09db","Type":"ContainerDied","Data":"fcb3b81368cfdd97e74b410c964dbfbdbb87d0936a08eeb9ff34372e922835bf"} Sep 30 18:04:43 crc kubenswrapper[4797]: I0930 18:04:43.633369 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" event={"ID":"1494eb69-c2b0-489c-a348-dfa4047c09db","Type":"ContainerStarted","Data":"4b235b6f5c96ea7558a10d76aabf48b21899ee5eb4dca4cff686fd25a6c6ae34"} Sep 30 
18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.036986 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.037566 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-central-agent" containerID="cri-o://8deeb4078b595a04d370deff8babe506af3234ac17b94a6acf8f5af4a2c770c8" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.037656 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="proxy-httpd" containerID="cri-o://25a591a7d2082190655a447bc0cbf2d15553414776ce37e051efcc30426c502d" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.037690 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-notification-agent" containerID="cri-o://081189dd0e3c0c1e205378345b9128f7a752a1e57d87819adcaa62546092b3d7" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.037674 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="sg-core" containerID="cri-o://09ce2456ae699edb4cfd1e1689832e775f1795d95fea1447dcc17a3e29e86516" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.050769 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.191345 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.191394 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.289448 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.310748 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.645231 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" event={"ID":"1494eb69-c2b0-489c-a348-dfa4047c09db","Type":"ContainerStarted","Data":"e60736780c3b04991c1feb25978907608a50de80fac89989504fa0954d452009"} Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.645309 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649168 4797 generic.go:334] "Generic (PLEG): container finished" podID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerID="25a591a7d2082190655a447bc0cbf2d15553414776ce37e051efcc30426c502d" exitCode=0 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649196 4797 generic.go:334] "Generic (PLEG): container finished" podID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerID="09ce2456ae699edb4cfd1e1689832e775f1795d95fea1447dcc17a3e29e86516" exitCode=2 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649205 4797 generic.go:334] "Generic 
(PLEG): container finished" podID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerID="8deeb4078b595a04d370deff8babe506af3234ac17b94a6acf8f5af4a2c770c8" exitCode=0 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649249 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerDied","Data":"25a591a7d2082190655a447bc0cbf2d15553414776ce37e051efcc30426c502d"} Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerDied","Data":"09ce2456ae699edb4cfd1e1689832e775f1795d95fea1447dcc17a3e29e86516"} Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649308 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerDied","Data":"8deeb4078b595a04d370deff8babe506af3234ac17b94a6acf8f5af4a2c770c8"} Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.649364 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-log" containerID="cri-o://9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.650455 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-api" containerID="cri-o://4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c" gracePeriod=30 Sep 30 18:04:44 crc kubenswrapper[4797]: I0930 18:04:44.680709 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" podStartSLOduration=3.68068831 podStartE2EDuration="3.68068831s" 
podCreationTimestamp="2025-09-30 18:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:44.673854474 +0000 UTC m=+1335.196353712" watchObservedRunningTime="2025-09-30 18:04:44.68068831 +0000 UTC m=+1335.203187548" Sep 30 18:04:45 crc kubenswrapper[4797]: I0930 18:04:45.669757 4797 generic.go:334] "Generic (PLEG): container finished" podID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerID="9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173" exitCode=143 Sep 30 18:04:45 crc kubenswrapper[4797]: I0930 18:04:45.670317 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerDied","Data":"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173"} Sep 30 18:04:47 crc kubenswrapper[4797]: I0930 18:04:47.705129 4797 generic.go:334] "Generic (PLEG): container finished" podID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerID="081189dd0e3c0c1e205378345b9128f7a752a1e57d87819adcaa62546092b3d7" exitCode=0 Sep 30 18:04:47 crc kubenswrapper[4797]: I0930 18:04:47.705242 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerDied","Data":"081189dd0e3c0c1e205378345b9128f7a752a1e57d87819adcaa62546092b3d7"} Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.024458 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.201685 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202201 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202257 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202280 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202299 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202383 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202535 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202599 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgpk\" (UniqueName: \"kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.202661 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd\") pod \"c8fce643-abdb-4582-940d-ddfc5bb35f56\" (UID: \"c8fce643-abdb-4582-940d-ddfc5bb35f56\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.203248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.203495 4797 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.203513 4797 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8fce643-abdb-4582-940d-ddfc5bb35f56-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.208117 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts" (OuterVolumeSpecName: "scripts") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.208284 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk" (OuterVolumeSpecName: "kube-api-access-xfgpk") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "kube-api-access-xfgpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.238248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.260917 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.301726 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.305198 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.305226 4797 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.305240 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.305253 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.305264 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgpk\" (UniqueName: \"kubernetes.io/projected/c8fce643-abdb-4582-940d-ddfc5bb35f56-kube-api-access-xfgpk\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.332932 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.336765 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data" (OuterVolumeSpecName: "config-data") pod "c8fce643-abdb-4582-940d-ddfc5bb35f56" (UID: "c8fce643-abdb-4582-940d-ddfc5bb35f56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.407647 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fce643-abdb-4582-940d-ddfc5bb35f56-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.509275 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle\") pod \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.509453 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data\") pod \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.509503 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs\") pod \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.509753 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcb6x\" (UniqueName: \"kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x\") pod \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\" (UID: \"28d37cd4-62dd-4a1a-93f0-2840201b2ec8\") " Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.511018 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs" (OuterVolumeSpecName: "logs") pod "28d37cd4-62dd-4a1a-93f0-2840201b2ec8" 
(UID: "28d37cd4-62dd-4a1a-93f0-2840201b2ec8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.511352 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.514328 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x" (OuterVolumeSpecName: "kube-api-access-qcb6x") pod "28d37cd4-62dd-4a1a-93f0-2840201b2ec8" (UID: "28d37cd4-62dd-4a1a-93f0-2840201b2ec8"). InnerVolumeSpecName "kube-api-access-qcb6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.540410 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28d37cd4-62dd-4a1a-93f0-2840201b2ec8" (UID: "28d37cd4-62dd-4a1a-93f0-2840201b2ec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.540816 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data" (OuterVolumeSpecName: "config-data") pod "28d37cd4-62dd-4a1a-93f0-2840201b2ec8" (UID: "28d37cd4-62dd-4a1a-93f0-2840201b2ec8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.613422 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.613473 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.613482 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcb6x\" (UniqueName: \"kubernetes.io/projected/28d37cd4-62dd-4a1a-93f0-2840201b2ec8-kube-api-access-qcb6x\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.718314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8fce643-abdb-4582-940d-ddfc5bb35f56","Type":"ContainerDied","Data":"1ec2ba3cd6b6ee7b792fcc0e317d6fb2b1b8c0ecc77fc94ed5e005a3f75df06b"} Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.718640 4797 scope.go:117] "RemoveContainer" containerID="25a591a7d2082190655a447bc0cbf2d15553414776ce37e051efcc30426c502d" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.718367 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.732478 4797 generic.go:334] "Generic (PLEG): container finished" podID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerID="4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c" exitCode=0 Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.732524 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerDied","Data":"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c"} Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.732573 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28d37cd4-62dd-4a1a-93f0-2840201b2ec8","Type":"ContainerDied","Data":"90f46afc4095da79b62dae96bfa880fc3c49859d9112c11e5371e704dbbdfebd"} Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.732535 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.744571 4797 scope.go:117] "RemoveContainer" containerID="09ce2456ae699edb4cfd1e1689832e775f1795d95fea1447dcc17a3e29e86516" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.755365 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.774798 4797 scope.go:117] "RemoveContainer" containerID="081189dd0e3c0c1e205378345b9128f7a752a1e57d87819adcaa62546092b3d7" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.788693 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.798816 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.813564 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822047 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822507 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="proxy-httpd" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822524 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="proxy-httpd" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822551 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="sg-core" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822558 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="sg-core" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822579 4797 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-central-agent" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822585 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-central-agent" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822644 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-log" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822653 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-log" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822669 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-notification-agent" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822676 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-notification-agent" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.822701 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-api" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822706 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-api" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822915 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-log" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822932 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-notification-agent" Sep 30 18:04:48 crc 
kubenswrapper[4797]: I0930 18:04:48.822941 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="ceilometer-central-agent" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822953 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" containerName="nova-api-api" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822963 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="sg-core" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.822971 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" containerName="proxy-httpd" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.824900 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.827640 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.827668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.827765 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.831581 4797 scope.go:117] "RemoveContainer" containerID="8deeb4078b595a04d370deff8babe506af3234ac17b94a6acf8f5af4a2c770c8" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.855721 4797 scope.go:117] "RemoveContainer" containerID="4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.858378 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: 
I0930 18:04:48.860224 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.862537 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.862778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.863855 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.884966 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.888779 4797 scope.go:117] "RemoveContainer" containerID="9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.897017 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.908782 4797 scope.go:117] "RemoveContainer" containerID="4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.912083 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c\": container with ID starting with 4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c not found: ID does not exist" containerID="4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.912448 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c"} err="failed to get container 
status \"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c\": rpc error: code = NotFound desc = could not find container \"4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c\": container with ID starting with 4fa84312c3b9b9d93b71dbf3d89023c69e4e85d60f338e89f296c3727c00419c not found: ID does not exist" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.912483 4797 scope.go:117] "RemoveContainer" containerID="9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173" Sep 30 18:04:48 crc kubenswrapper[4797]: E0930 18:04:48.912867 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173\": container with ID starting with 9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173 not found: ID does not exist" containerID="9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.912955 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173"} err="failed to get container status \"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173\": rpc error: code = NotFound desc = could not find container \"9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173\": container with ID starting with 9f57a385b3232c7406af2e9b9fc014481b41fd441ecb556c1e7217baeb5a8173 not found: ID does not exist" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.926499 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 
18:04:48.926563 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-config-data\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.926645 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqr9l\" (UniqueName: \"kubernetes.io/projected/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-kube-api-access-sqr9l\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.926671 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-scripts\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.927110 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.927186 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.927244 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-log-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:48 crc kubenswrapper[4797]: I0930 18:04:48.927279 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-run-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.029889 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlw8b\" (UniqueName: \"kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.029977 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqr9l\" (UniqueName: \"kubernetes.io/projected/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-kube-api-access-sqr9l\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030004 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-scripts\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030073 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030631 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030720 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030781 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030856 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030895 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030922 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-log-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.030952 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-run-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.031008 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.031066 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.031099 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-config-data\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.031601 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-run-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " 
pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.032103 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-log-httpd\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.036159 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.036229 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-scripts\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.036546 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.036649 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-config-data\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.038817 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.046790 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqr9l\" (UniqueName: \"kubernetes.io/projected/239988d8-f0f2-49d2-95aa-2f50d3b1f5ce-kube-api-access-sqr9l\") pod \"ceilometer-0\" (UID: \"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce\") " pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.132956 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlw8b\" (UniqueName: \"kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.133025 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.133088 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.133151 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 
18:04:49.133192 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.133282 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.133783 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.136380 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.137474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.137605 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.138989 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.155373 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.164868 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlw8b\" (UniqueName: \"kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b\") pod \"nova-api-0\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.193846 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.290322 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.313541 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.651849 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.722796 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:49 crc kubenswrapper[4797]: W0930 18:04:49.726941 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ee1370_8400_445a_ae36_73ba882e1901.slice/crio-804d8860ca918b127f10426908e1d1f233e338d1267e3d4f50c2e55c705929f0 WatchSource:0}: Error finding container 804d8860ca918b127f10426908e1d1f233e338d1267e3d4f50c2e55c705929f0: Status 404 returned error can't find the container with id 804d8860ca918b127f10426908e1d1f233e338d1267e3d4f50c2e55c705929f0 Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.758766 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerStarted","Data":"804d8860ca918b127f10426908e1d1f233e338d1267e3d4f50c2e55c705929f0"} Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.763933 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce","Type":"ContainerStarted","Data":"69953e5c3ac46e1e677b85284dfd8f8a4e3ec2c48d401bc0ee2bf6cba3323d11"} Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.784757 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.955616 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v7dn7"] Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.957114 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.960307 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.960987 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 18:04:49 crc kubenswrapper[4797]: I0930 18:04:49.974141 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v7dn7"] Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.049956 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.050012 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.050039 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knjs\" (UniqueName: 
\"kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.050109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.151934 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.151999 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.152027 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knjs\" (UniqueName: \"kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.152108 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.157025 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.157951 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.158227 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.168775 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knjs\" (UniqueName: \"kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs\") pod \"nova-cell1-cell-mapping-v7dn7\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.250380 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d37cd4-62dd-4a1a-93f0-2840201b2ec8" path="/var/lib/kubelet/pods/28d37cd4-62dd-4a1a-93f0-2840201b2ec8/volumes" Sep 30 18:04:50 crc 
kubenswrapper[4797]: I0930 18:04:50.251246 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8fce643-abdb-4582-940d-ddfc5bb35f56" path="/var/lib/kubelet/pods/c8fce643-abdb-4582-940d-ddfc5bb35f56/volumes" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.283604 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.772980 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v7dn7"] Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.783541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerStarted","Data":"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a"} Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.783582 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerStarted","Data":"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43"} Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.796523 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce","Type":"ContainerStarted","Data":"9c8c60e80d8333c1fc42fbed94314ffeb2f49100115947d3a0a74025527b8615"} Sep 30 18:04:50 crc kubenswrapper[4797]: I0930 18:04:50.824765 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.824744743 podStartE2EDuration="2.824744743s" podCreationTimestamp="2025-09-30 18:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:50.812659503 +0000 UTC m=+1341.335158741" watchObservedRunningTime="2025-09-30 18:04:50.824744743 
+0000 UTC m=+1341.347243981" Sep 30 18:04:51 crc kubenswrapper[4797]: I0930 18:04:51.852306 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce","Type":"ContainerStarted","Data":"cccb0dbfd937bc9868a475a17c99adc7af488639754c8c3b71df3473215cf6b9"} Sep 30 18:04:51 crc kubenswrapper[4797]: I0930 18:04:51.858694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v7dn7" event={"ID":"f803581d-a565-4443-8672-c57c027a3af5","Type":"ContainerStarted","Data":"0f3d0d7fe75661e083dcc2e480d29f62590dc9195787680c6756b070718e39c9"} Sep 30 18:04:51 crc kubenswrapper[4797]: I0930 18:04:51.858890 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v7dn7" event={"ID":"f803581d-a565-4443-8672-c57c027a3af5","Type":"ContainerStarted","Data":"9f94441e7728e6daa19dbb6e70db40fe8e846b8fc276255d137833e956454709"} Sep 30 18:04:51 crc kubenswrapper[4797]: I0930 18:04:51.893354 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v7dn7" podStartSLOduration=2.89332733 podStartE2EDuration="2.89332733s" podCreationTimestamp="2025-09-30 18:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:04:51.878862415 +0000 UTC m=+1342.401361653" watchObservedRunningTime="2025-09-30 18:04:51.89332733 +0000 UTC m=+1342.415826568" Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.255872 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.336257 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.337281 4797 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="dnsmasq-dns" containerID="cri-o://d8ba5334b065be7485c370f94caa6e088fe691fe068ffdd5a3ca62405baf8b7b" gracePeriod=10 Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.871407 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce","Type":"ContainerStarted","Data":"3a77c13cc880ba554774cf8b1b607ea595201cd3b5c55753c3c8391e3cdf07fa"} Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.873201 4797 generic.go:334] "Generic (PLEG): container finished" podID="74964b4b-482a-4d85-a081-7e0625c93056" containerID="d8ba5334b065be7485c370f94caa6e088fe691fe068ffdd5a3ca62405baf8b7b" exitCode=0 Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.873308 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerDied","Data":"d8ba5334b065be7485c370f94caa6e088fe691fe068ffdd5a3ca62405baf8b7b"} Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.873369 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" event={"ID":"74964b4b-482a-4d85-a081-7e0625c93056","Type":"ContainerDied","Data":"1c366c5a89b9eb1dbc04ce08a1ba24393a3682cd24fe0382e8b588523aecfff9"} Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.873385 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c366c5a89b9eb1dbc04ce08a1ba24393a3682cd24fe0382e8b588523aecfff9" Sep 30 18:04:52 crc kubenswrapper[4797]: I0930 18:04:52.892361 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.008583 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.008685 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.008728 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.008821 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.009473 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlwgh\" (UniqueName: \"kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.009645 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config\") pod \"74964b4b-482a-4d85-a081-7e0625c93056\" (UID: \"74964b4b-482a-4d85-a081-7e0625c93056\") " Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.014098 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh" (OuterVolumeSpecName: "kube-api-access-vlwgh") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "kube-api-access-vlwgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.072598 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config" (OuterVolumeSpecName: "config") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.073942 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.094169 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.097888 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.111501 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.111539 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.111552 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.111560 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.111568 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlwgh\" (UniqueName: \"kubernetes.io/projected/74964b4b-482a-4d85-a081-7e0625c93056-kube-api-access-vlwgh\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.127129 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74964b4b-482a-4d85-a081-7e0625c93056" (UID: "74964b4b-482a-4d85-a081-7e0625c93056"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.213721 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74964b4b-482a-4d85-a081-7e0625c93056-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.888662 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-69vds" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.888694 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"239988d8-f0f2-49d2-95aa-2f50d3b1f5ce","Type":"ContainerStarted","Data":"e2e42979ec0a5cdb01a03e28c9193789425be1198a58e193e4aa01c7b220c7db"} Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.889243 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.925803 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.137888185 podStartE2EDuration="5.925785886s" podCreationTimestamp="2025-09-30 18:04:48 +0000 UTC" firstStartedPulling="2025-09-30 18:04:49.644104648 +0000 UTC m=+1340.166603886" lastFinishedPulling="2025-09-30 18:04:53.432002339 +0000 UTC m=+1343.954501587" observedRunningTime="2025-09-30 18:04:53.916942875 +0000 UTC m=+1344.439442103" watchObservedRunningTime="2025-09-30 18:04:53.925785886 +0000 UTC m=+1344.448285114" Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.940182 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:53 crc kubenswrapper[4797]: I0930 18:04:53.952657 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-69vds"] Sep 30 18:04:54 crc kubenswrapper[4797]: I0930 18:04:54.254787 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74964b4b-482a-4d85-a081-7e0625c93056" path="/var/lib/kubelet/pods/74964b4b-482a-4d85-a081-7e0625c93056/volumes" Sep 30 18:04:56 crc kubenswrapper[4797]: I0930 18:04:56.923695 4797 generic.go:334] "Generic (PLEG): container finished" podID="f803581d-a565-4443-8672-c57c027a3af5" containerID="0f3d0d7fe75661e083dcc2e480d29f62590dc9195787680c6756b070718e39c9" exitCode=0 Sep 30 18:04:56 crc kubenswrapper[4797]: I0930 18:04:56.923805 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v7dn7" event={"ID":"f803581d-a565-4443-8672-c57c027a3af5","Type":"ContainerDied","Data":"0f3d0d7fe75661e083dcc2e480d29f62590dc9195787680c6756b070718e39c9"} Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.381286 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.524947 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle\") pod \"f803581d-a565-4443-8672-c57c027a3af5\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.525330 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data\") pod \"f803581d-a565-4443-8672-c57c027a3af5\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.525673 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts\") pod \"f803581d-a565-4443-8672-c57c027a3af5\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.525969 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4knjs\" (UniqueName: \"kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs\") pod \"f803581d-a565-4443-8672-c57c027a3af5\" (UID: \"f803581d-a565-4443-8672-c57c027a3af5\") " Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.532584 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs" (OuterVolumeSpecName: "kube-api-access-4knjs") pod "f803581d-a565-4443-8672-c57c027a3af5" (UID: "f803581d-a565-4443-8672-c57c027a3af5"). InnerVolumeSpecName "kube-api-access-4knjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.533294 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts" (OuterVolumeSpecName: "scripts") pod "f803581d-a565-4443-8672-c57c027a3af5" (UID: "f803581d-a565-4443-8672-c57c027a3af5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.565825 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data" (OuterVolumeSpecName: "config-data") pod "f803581d-a565-4443-8672-c57c027a3af5" (UID: "f803581d-a565-4443-8672-c57c027a3af5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.580723 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f803581d-a565-4443-8672-c57c027a3af5" (UID: "f803581d-a565-4443-8672-c57c027a3af5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.629049 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.629080 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.629088 4797 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f803581d-a565-4443-8672-c57c027a3af5-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.629097 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4knjs\" (UniqueName: \"kubernetes.io/projected/f803581d-a565-4443-8672-c57c027a3af5-kube-api-access-4knjs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.953421 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v7dn7" event={"ID":"f803581d-a565-4443-8672-c57c027a3af5","Type":"ContainerDied","Data":"9f94441e7728e6daa19dbb6e70db40fe8e846b8fc276255d137833e956454709"} Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.953482 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f94441e7728e6daa19dbb6e70db40fe8e846b8fc276255d137833e956454709" Sep 30 18:04:58 crc kubenswrapper[4797]: I0930 18:04:58.953493 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v7dn7" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.153514 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.154139 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-log" containerID="cri-o://3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" gracePeriod=30 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.154261 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-api" containerID="cri-o://2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" gracePeriod=30 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.171847 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.172065 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="733c5145-7413-4b60-9e71-2c96a13cb8b9" containerName="nova-scheduler-scheduler" containerID="cri-o://4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543" gracePeriod=30 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.218712 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.218966 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log" containerID="cri-o://9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311" gracePeriod=30 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.219093 4797 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata" containerID="cri-o://5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9" gracePeriod=30 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.817018 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957110 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957162 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlw8b\" (UniqueName: \"kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957222 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957263 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957277 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957310 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle\") pod \"33ee1370-8400-445a-ae36-73ba882e1901\" (UID: \"33ee1370-8400-445a-ae36-73ba882e1901\") " Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.957401 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs" (OuterVolumeSpecName: "logs") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.958560 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33ee1370-8400-445a-ae36-73ba882e1901-logs\") on node \"crc\" DevicePath \"\"" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.965260 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b" (OuterVolumeSpecName: "kube-api-access-xlw8b") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "kube-api-access-xlw8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967388 4797 generic.go:334] "Generic (PLEG): container finished" podID="33ee1370-8400-445a-ae36-73ba882e1901" containerID="2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" exitCode=0 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967417 4797 generic.go:334] "Generic (PLEG): container finished" podID="33ee1370-8400-445a-ae36-73ba882e1901" containerID="3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" exitCode=143 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerDied","Data":"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a"} Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967548 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerDied","Data":"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43"} Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967558 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33ee1370-8400-445a-ae36-73ba882e1901","Type":"ContainerDied","Data":"804d8860ca918b127f10426908e1d1f233e338d1267e3d4f50c2e55c705929f0"} Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967556 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.967978 4797 scope.go:117] "RemoveContainer" containerID="2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.971679 4797 generic.go:334] "Generic (PLEG): container finished" podID="c41bcd62-8f3f-4510-b271-be662ca71353" containerID="9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311" exitCode=143 Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.971709 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerDied","Data":"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"} Sep 30 18:04:59 crc kubenswrapper[4797]: I0930 18:04:59.989787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data" (OuterVolumeSpecName: "config-data") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.022266 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.028230 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.033184 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "33ee1370-8400-445a-ae36-73ba882e1901" (UID: "33ee1370-8400-445a-ae36-73ba882e1901"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.060540 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlw8b\" (UniqueName: \"kubernetes.io/projected/33ee1370-8400-445a-ae36-73ba882e1901-kube-api-access-xlw8b\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.060584 4797 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.060597 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.060612 4797 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-internal-tls-certs\") on 
node \"crc\" DevicePath \"\"" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.060626 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee1370-8400-445a-ae36-73ba882e1901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.159720 4797 scope.go:117] "RemoveContainer" containerID="3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.188648 4797 scope.go:117] "RemoveContainer" containerID="2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.189242 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a\": container with ID starting with 2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a not found: ID does not exist" containerID="2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.189283 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a"} err="failed to get container status \"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a\": rpc error: code = NotFound desc = could not find container \"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a\": container with ID starting with 2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a not found: ID does not exist" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.189314 4797 scope.go:117] "RemoveContainer" containerID="3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.189814 4797 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43\": container with ID starting with 3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43 not found: ID does not exist" containerID="3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.189845 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43"} err="failed to get container status \"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43\": rpc error: code = NotFound desc = could not find container \"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43\": container with ID starting with 3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43 not found: ID does not exist" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.189865 4797 scope.go:117] "RemoveContainer" containerID="2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.190117 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a"} err="failed to get container status \"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a\": rpc error: code = NotFound desc = could not find container \"2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a\": container with ID starting with 2b9dfb9a01f58042927b479296f3d0dd2868a8da6f6d60ce4ac10406f9a50a7a not found: ID does not exist" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.190134 4797 scope.go:117] "RemoveContainer" containerID="3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.190318 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43"} err="failed to get container status \"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43\": rpc error: code = NotFound desc = could not find container \"3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43\": container with ID starting with 3a1f49debfd7f67d7680d739a65353196d33b666ace0b7642abf939868739d43 not found: ID does not exist" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.299396 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.307945 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.325481 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.326138 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="dnsmasq-dns" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326176 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="dnsmasq-dns" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.326196 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-api" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326205 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-api" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.326234 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f803581d-a565-4443-8672-c57c027a3af5" containerName="nova-manage" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 
18:05:00.326242 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f803581d-a565-4443-8672-c57c027a3af5" containerName="nova-manage" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.326302 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="init" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326310 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="init" Sep 30 18:05:00 crc kubenswrapper[4797]: E0930 18:05:00.326326 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-log" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326335 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-log" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326626 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-api" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326653 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="74964b4b-482a-4d85-a081-7e0625c93056" containerName="dnsmasq-dns" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326667 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee1370-8400-445a-ae36-73ba882e1901" containerName="nova-api-log" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.326692 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f803581d-a565-4443-8672-c57c027a3af5" containerName="nova-manage" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.328098 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.333975 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.334059 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.334151 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.346807 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.478532 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.478595 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.478631 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-config-data\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.478782 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.478845 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-logs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.479127 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm7pm\" (UniqueName: \"kubernetes.io/projected/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-kube-api-access-nm7pm\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0" Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.539114 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580328 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm7pm\" (UniqueName: \"kubernetes.io/projected/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-kube-api-access-nm7pm\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580392 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580449 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580478 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-config-data\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580514 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580538 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-logs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.580982 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-logs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.585946 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.586068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.589254 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-config-data\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.590968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.596728 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm7pm\" (UniqueName: \"kubernetes.io/projected/e0b40b7a-54a6-4fb4-868d-85f26823aeb3-kube-api-access-nm7pm\") pod \"nova-api-0\" (UID: \"e0b40b7a-54a6-4fb4-868d-85f26823aeb3\") " pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.652929 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.681688 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data\") pod \"733c5145-7413-4b60-9e71-2c96a13cb8b9\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") "
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.682878 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle\") pod \"733c5145-7413-4b60-9e71-2c96a13cb8b9\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") "
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.683288 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2zh\" (UniqueName: \"kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh\") pod \"733c5145-7413-4b60-9e71-2c96a13cb8b9\" (UID: \"733c5145-7413-4b60-9e71-2c96a13cb8b9\") "
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.696301 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh" (OuterVolumeSpecName: "kube-api-access-4q2zh") pod "733c5145-7413-4b60-9e71-2c96a13cb8b9" (UID: "733c5145-7413-4b60-9e71-2c96a13cb8b9"). InnerVolumeSpecName "kube-api-access-4q2zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.720388 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data" (OuterVolumeSpecName: "config-data") pod "733c5145-7413-4b60-9e71-2c96a13cb8b9" (UID: "733c5145-7413-4b60-9e71-2c96a13cb8b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.720943 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "733c5145-7413-4b60-9e71-2c96a13cb8b9" (UID: "733c5145-7413-4b60-9e71-2c96a13cb8b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.787191 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.787229 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733c5145-7413-4b60-9e71-2c96a13cb8b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.787249 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2zh\" (UniqueName: \"kubernetes.io/projected/733c5145-7413-4b60-9e71-2c96a13cb8b9-kube-api-access-4q2zh\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.992467 4797 generic.go:334] "Generic (PLEG): container finished" podID="733c5145-7413-4b60-9e71-2c96a13cb8b9" containerID="4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543" exitCode=0
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.992581 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"733c5145-7413-4b60-9e71-2c96a13cb8b9","Type":"ContainerDied","Data":"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"}
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.992609 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"733c5145-7413-4b60-9e71-2c96a13cb8b9","Type":"ContainerDied","Data":"ea02bacceb30db2ef0d8ed6d7c3c7128baf69420071d1352c48d8da953cb29db"}
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.992625 4797 scope.go:117] "RemoveContainer" containerID="4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"
Sep 30 18:05:00 crc kubenswrapper[4797]: I0930 18:05:00.992766 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.030955 4797 scope.go:117] "RemoveContainer" containerID="4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.044025 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.054089 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: E0930 18:05:01.054197 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543\": container with ID starting with 4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543 not found: ID does not exist" containerID="4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.054220 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543"} err="failed to get container status \"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543\": rpc error: code = NotFound desc = could not find container \"4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543\": container with ID starting with 4ad587b9a0b76d60a70cbf7f73bfdf4ff6c8c09a540e58d250a7a7b6be990543 not found: ID does not exist"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.073211 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: E0930 18:05:01.073650 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733c5145-7413-4b60-9e71-2c96a13cb8b9" containerName="nova-scheduler-scheduler"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.073668 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="733c5145-7413-4b60-9e71-2c96a13cb8b9" containerName="nova-scheduler-scheduler"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.073903 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="733c5145-7413-4b60-9e71-2c96a13cb8b9" containerName="nova-scheduler-scheduler"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.074562 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.076618 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.082868 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.140279 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.193697 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf494\" (UniqueName: \"kubernetes.io/projected/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-kube-api-access-gf494\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.193799 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-config-data\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.193974 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.295974 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf494\" (UniqueName: \"kubernetes.io/projected/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-kube-api-access-gf494\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.296032 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-config-data\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.296109 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.299372 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.299786 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-config-data\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.319537 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf494\" (UniqueName: \"kubernetes.io/projected/fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b-kube-api-access-gf494\") pod \"nova-scheduler-0\" (UID: \"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b\") " pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.394077 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 18:05:01 crc kubenswrapper[4797]: I0930 18:05:01.880608 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 18:05:01 crc kubenswrapper[4797]: W0930 18:05:01.887569 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc0b01e6_a07d_4c6a_9bf5_1bada38ee89b.slice/crio-863e6bb66520d04a61cc8c121bee709094b368d2203b828c49c5c58c551437e0 WatchSource:0}: Error finding container 863e6bb66520d04a61cc8c121bee709094b368d2203b828c49c5c58c551437e0: Status 404 returned error can't find the container with id 863e6bb66520d04a61cc8c121bee709094b368d2203b828c49c5c58c551437e0
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.032454 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b","Type":"ContainerStarted","Data":"863e6bb66520d04a61cc8c121bee709094b368d2203b828c49c5c58c551437e0"}
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.034144 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0b40b7a-54a6-4fb4-868d-85f26823aeb3","Type":"ContainerStarted","Data":"f455fc2f1ae6f9e7c8bb4e759c699c876e2ea7198f3885ee99d7bddbdacbfd96"}
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.034176 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0b40b7a-54a6-4fb4-868d-85f26823aeb3","Type":"ContainerStarted","Data":"619d74ee6920c2e6315de058a1251f5fb262aed39058d3399af8bddecd50ce28"}
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.034189 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0b40b7a-54a6-4fb4-868d-85f26823aeb3","Type":"ContainerStarted","Data":"3da086de98e394be8a1eb79d4158dc9a5b196c59a3e1f49e8298fe42bf553c3a"}
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.065463 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.065422849 podStartE2EDuration="2.065422849s" podCreationTimestamp="2025-09-30 18:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:05:02.062367846 +0000 UTC m=+1352.584867124" watchObservedRunningTime="2025-09-30 18:05:02.065422849 +0000 UTC m=+1352.587922087"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.252637 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ee1370-8400-445a-ae36-73ba882e1901" path="/var/lib/kubelet/pods/33ee1370-8400-445a-ae36-73ba882e1901/volumes"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.253651 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733c5145-7413-4b60-9e71-2c96a13cb8b9" path="/var/lib/kubelet/pods/733c5145-7413-4b60-9e71-2c96a13cb8b9/volumes"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.356567 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:47258->10.217.0.215:8775: read: connection reset by peer"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.356660 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:47248->10.217.0.215:8775: read: connection reset by peer"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.859107 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.929230 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle\") pod \"c41bcd62-8f3f-4510-b271-be662ca71353\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") "
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.929351 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwq89\" (UniqueName: \"kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89\") pod \"c41bcd62-8f3f-4510-b271-be662ca71353\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") "
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.929373 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs\") pod \"c41bcd62-8f3f-4510-b271-be662ca71353\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") "
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.929403 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data\") pod \"c41bcd62-8f3f-4510-b271-be662ca71353\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") "
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.929483 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs\") pod \"c41bcd62-8f3f-4510-b271-be662ca71353\" (UID: \"c41bcd62-8f3f-4510-b271-be662ca71353\") "
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.930330 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs" (OuterVolumeSpecName: "logs") pod "c41bcd62-8f3f-4510-b271-be662ca71353" (UID: "c41bcd62-8f3f-4510-b271-be662ca71353"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.947866 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89" (OuterVolumeSpecName: "kube-api-access-bwq89") pod "c41bcd62-8f3f-4510-b271-be662ca71353" (UID: "c41bcd62-8f3f-4510-b271-be662ca71353"). InnerVolumeSpecName "kube-api-access-bwq89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:05:02 crc kubenswrapper[4797]: I0930 18:05:02.968020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data" (OuterVolumeSpecName: "config-data") pod "c41bcd62-8f3f-4510-b271-be662ca71353" (UID: "c41bcd62-8f3f-4510-b271-be662ca71353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.013173 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c41bcd62-8f3f-4510-b271-be662ca71353" (UID: "c41bcd62-8f3f-4510-b271-be662ca71353"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.019739 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41bcd62-8f3f-4510-b271-be662ca71353" (UID: "c41bcd62-8f3f-4510-b271-be662ca71353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.034092 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.034137 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwq89\" (UniqueName: \"kubernetes.io/projected/c41bcd62-8f3f-4510-b271-be662ca71353-kube-api-access-bwq89\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.034152 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.034164 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bcd62-8f3f-4510-b271-be662ca71353-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.034176 4797 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41bcd62-8f3f-4510-b271-be662ca71353-logs\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.067592 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b","Type":"ContainerStarted","Data":"cd33070db7ee7e0e7788fb4830aef91b727c0df838f9b6965e72edee5ae858cf"}
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.071411 4797 generic.go:334] "Generic (PLEG): container finished" podID="c41bcd62-8f3f-4510-b271-be662ca71353" containerID="5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9" exitCode=0
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.071747 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.071889 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerDied","Data":"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"}
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.072015 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c41bcd62-8f3f-4510-b271-be662ca71353","Type":"ContainerDied","Data":"04042c4284c1bb3b27549d2efb41c3e0339d9bdea98ee1f711a5607197cfec57"}
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.072041 4797 scope.go:117] "RemoveContainer" containerID="5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.092619 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.09242068 podStartE2EDuration="2.09242068s" podCreationTimestamp="2025-09-30 18:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:05:03.08468836 +0000 UTC m=+1353.607187598" watchObservedRunningTime="2025-09-30 18:05:03.09242068 +0000 UTC m=+1353.614919918"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.120306 4797 scope.go:117] "RemoveContainer" containerID="9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.129665 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.146510 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.162768 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 18:05:03 crc kubenswrapper[4797]: E0930 18:05:03.163199 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.163211 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log"
Sep 30 18:05:03 crc kubenswrapper[4797]: E0930 18:05:03.163256 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.163265 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.163483 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-log"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.163512 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" containerName="nova-metadata-metadata"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.164654 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.165542 4797 scope.go:117] "RemoveContainer" containerID="5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"
Sep 30 18:05:03 crc kubenswrapper[4797]: E0930 18:05:03.165990 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9\": container with ID starting with 5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9 not found: ID does not exist" containerID="5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.166012 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9"} err="failed to get container status \"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9\": rpc error: code = NotFound desc = could not find container \"5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9\": container with ID starting with 5c8804d759259849cfdde92cca6228b557edf40ee5192fa045345dbd2cc787e9 not found: ID does not exist"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.166031 4797 scope.go:117] "RemoveContainer" containerID="9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"
Sep 30 18:05:03 crc kubenswrapper[4797]: E0930 18:05:03.166213 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311\": container with ID starting with 9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311 not found: ID does not exist" containerID="9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.166228 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311"} err="failed to get container status \"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311\": rpc error: code = NotFound desc = could not find container \"9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311\": container with ID starting with 9a8494d8b113ddb3d32c26e9000cc8f117fcc808cdc834db71c248af4ab84311 not found: ID does not exist"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.171645 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.171809 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.190672 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.245134 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgrn\" (UniqueName: \"kubernetes.io/projected/4225b925-9d11-4b4b-8e2d-1063584ef26c-kube-api-access-pdgrn\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.245188 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.245287 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-config-data\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.245360 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4225b925-9d11-4b4b-8e2d-1063584ef26c-logs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.245424 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.347495 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.347535 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgrn\" (UniqueName: \"kubernetes.io/projected/4225b925-9d11-4b4b-8e2d-1063584ef26c-kube-api-access-pdgrn\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.347559 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.347638 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-config-data\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.347706 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4225b925-9d11-4b4b-8e2d-1063584ef26c-logs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.348060 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4225b925-9d11-4b4b-8e2d-1063584ef26c-logs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.354653 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-config-data\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.354921 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.355121 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225b925-9d11-4b4b-8e2d-1063584ef26c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.367864 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgrn\" (UniqueName: \"kubernetes.io/projected/4225b925-9d11-4b4b-8e2d-1063584ef26c-kube-api-access-pdgrn\") pod \"nova-metadata-0\" (UID: \"4225b925-9d11-4b4b-8e2d-1063584ef26c\") " pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.507061 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 18:05:03 crc kubenswrapper[4797]: I0930 18:05:03.966589 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 18:05:04 crc kubenswrapper[4797]: I0930 18:05:04.084200 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4225b925-9d11-4b4b-8e2d-1063584ef26c","Type":"ContainerStarted","Data":"bfabd376be93a0993b16710905df00454dbf9836dd51dd828bfdeed7be0700c7"}
Sep 30 18:05:04 crc kubenswrapper[4797]: I0930 18:05:04.256011 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41bcd62-8f3f-4510-b271-be662ca71353" path="/var/lib/kubelet/pods/c41bcd62-8f3f-4510-b271-be662ca71353/volumes"
Sep 30 18:05:05 crc kubenswrapper[4797]: I0930 18:05:05.101854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4225b925-9d11-4b4b-8e2d-1063584ef26c","Type":"ContainerStarted","Data":"0735ee62b4e36eae44645545e33403ad9f9748cafd3150ab38ddc14f721e8d50"}
Sep 30 18:05:05 crc kubenswrapper[4797]: I0930 18:05:05.102297 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4225b925-9d11-4b4b-8e2d-1063584ef26c","Type":"ContainerStarted","Data":"19ce7d9666f8bf36711052e0c0480ca76245d2437b93b5d06940d9d76660d158"}
Sep 30 18:05:05 crc kubenswrapper[4797]: I0930 18:05:05.128882 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.128863285 podStartE2EDuration="2.128863285s" podCreationTimestamp="2025-09-30 18:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:05:05.128120235 +0000 UTC m=+1355.650619503" watchObservedRunningTime="2025-09-30 18:05:05.128863285 +0000 UTC m=+1355.651362523"
Sep 30 18:05:06 crc kubenswrapper[4797]: I0930 18:05:06.394929 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 18:05:08 crc kubenswrapper[4797]: I0930 18:05:08.514721 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 18:05:08 crc kubenswrapper[4797]: I0930 18:05:08.515163 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 18:05:10 crc kubenswrapper[4797]: I0930 18:05:10.654397 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 18:05:10 crc kubenswrapper[4797]: I0930 18:05:10.656230 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 18:05:11 crc kubenswrapper[4797]: I0930 18:05:11.395165 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Sep 30 18:05:11 crc kubenswrapper[4797]: I0930 18:05:11.438890 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 18:05:11 crc kubenswrapper[4797]: I0930 18:05:11.674859 4797 prober.go:107] "Probe failed" probeType="Startup"
pod="openstack/nova-api-0" podUID="e0b40b7a-54a6-4fb4-868d-85f26823aeb3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:05:11 crc kubenswrapper[4797]: I0930 18:05:11.674866 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e0b40b7a-54a6-4fb4-868d-85f26823aeb3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:05:12 crc kubenswrapper[4797]: I0930 18:05:12.220881 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 18:05:13 crc kubenswrapper[4797]: I0930 18:05:13.508156 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 18:05:13 crc kubenswrapper[4797]: I0930 18:05:13.508239 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.191942 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.192316 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.192379 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.193276 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.193348 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae" gracePeriod=600 Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.524731 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4225b925-9d11-4b4b-8e2d-1063584ef26c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:05:14 crc kubenswrapper[4797]: I0930 18:05:14.524868 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4225b925-9d11-4b4b-8e2d-1063584ef26c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 18:05:15 crc kubenswrapper[4797]: I0930 18:05:15.226248 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae" exitCode=0 Sep 30 18:05:15 crc kubenswrapper[4797]: I0930 18:05:15.226453 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae"} Sep 30 18:05:15 crc kubenswrapper[4797]: I0930 18:05:15.226659 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"} Sep 30 18:05:15 crc kubenswrapper[4797]: I0930 18:05:15.226684 4797 scope.go:117] "RemoveContainer" containerID="f48f1c21375195dc21057b3f6223be6922c12af920895f2ea187fde9415ae1df" Sep 30 18:05:19 crc kubenswrapper[4797]: I0930 18:05:19.167947 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 18:05:20 crc kubenswrapper[4797]: I0930 18:05:20.661580 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 18:05:20 crc kubenswrapper[4797]: I0930 18:05:20.662100 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 18:05:20 crc kubenswrapper[4797]: I0930 18:05:20.662522 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 18:05:20 crc kubenswrapper[4797]: I0930 18:05:20.667144 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 18:05:21 crc kubenswrapper[4797]: I0930 18:05:21.298877 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 18:05:21 crc kubenswrapper[4797]: I0930 18:05:21.312786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 18:05:23 crc kubenswrapper[4797]: I0930 18:05:23.518890 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 18:05:23 crc kubenswrapper[4797]: I0930 18:05:23.519272 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 18:05:23 crc kubenswrapper[4797]: I0930 18:05:23.531852 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 18:05:23 crc kubenswrapper[4797]: I0930 18:05:23.535855 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 18:05:32 crc kubenswrapper[4797]: I0930 18:05:32.002942 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 18:05:33 crc kubenswrapper[4797]: I0930 18:05:33.140917 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 18:05:36 crc kubenswrapper[4797]: I0930 18:05:36.154751 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="rabbitmq" containerID="cri-o://df96003e9fe01c2d21ccf4fa5f58d3eaab27428f49a64ad0f4d986716752d52e" gracePeriod=604796 Sep 30 18:05:37 crc kubenswrapper[4797]: I0930 18:05:37.318410 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="rabbitmq" containerID="cri-o://4c7e640eaf85c51f57b6cf5bb940418082a77d3851b444a2154f6865b2d5de23" gracePeriod=604796 Sep 30 18:05:38 crc kubenswrapper[4797]: I0930 18:05:38.365630 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Sep 30 18:05:38 crc kubenswrapper[4797]: I0930 18:05:38.620944 
4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.542692 4797 generic.go:334] "Generic (PLEG): container finished" podID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerID="df96003e9fe01c2d21ccf4fa5f58d3eaab27428f49a64ad0f4d986716752d52e" exitCode=0 Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.542774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerDied","Data":"df96003e9fe01c2d21ccf4fa5f58d3eaab27428f49a64ad0f4d986716752d52e"} Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.861926 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899612 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899689 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899760 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" 
(UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899801 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899867 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899899 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.899956 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.900017 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.900093 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.900150 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.900196 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkth4\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4\") pod \"65610b42-1ed9-4a27-996a-09e0ebd560e5\" (UID: \"65610b42-1ed9-4a27-996a-09e0ebd560e5\") " Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.926552 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.930172 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.933361 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info" (OuterVolumeSpecName: "pod-info") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.934683 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4" (OuterVolumeSpecName: "kube-api-access-mkth4") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "kube-api-access-mkth4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.935063 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.935741 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.935972 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:05:42 crc kubenswrapper[4797]: I0930 18:05:42.947925 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.000401 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data" (OuterVolumeSpecName: "config-data") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007343 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007372 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkth4\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-kube-api-access-mkth4\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007382 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65610b42-1ed9-4a27-996a-09e0ebd560e5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007391 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007400 4797 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65610b42-1ed9-4a27-996a-09e0ebd560e5-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007410 4797 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007507 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 
18:05:43.007516 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.007524 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.040197 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf" (OuterVolumeSpecName: "server-conf") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.064563 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.082652 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "65610b42-1ed9-4a27-996a-09e0ebd560e5" (UID: "65610b42-1ed9-4a27-996a-09e0ebd560e5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.112509 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.112588 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65610b42-1ed9-4a27-996a-09e0ebd560e5-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.112604 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65610b42-1ed9-4a27-996a-09e0ebd560e5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.560673 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.560723 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"65610b42-1ed9-4a27-996a-09e0ebd560e5","Type":"ContainerDied","Data":"d9e349f7b742a7281d6b3dac55b1fcbb11bf69f22d50fa610428485307bf03bd"} Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.561654 4797 scope.go:117] "RemoveContainer" containerID="df96003e9fe01c2d21ccf4fa5f58d3eaab27428f49a64ad0f4d986716752d52e" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.568913 4797 generic.go:334] "Generic (PLEG): container finished" podID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerID="4c7e640eaf85c51f57b6cf5bb940418082a77d3851b444a2154f6865b2d5de23" exitCode=0 Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.568954 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerDied","Data":"4c7e640eaf85c51f57b6cf5bb940418082a77d3851b444a2154f6865b2d5de23"} Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.641342 4797 scope.go:117] "RemoveContainer" containerID="32d899fb0b4dcce53cc91d0ae0c0891dd0383d1706912339fb8c5a31d211f1b7" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.643070 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.685970 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.706550 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 18:05:43 crc kubenswrapper[4797]: E0930 18:05:43.707055 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="setup-container" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.707075 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="setup-container" Sep 30 18:05:43 crc kubenswrapper[4797]: E0930 18:05:43.707096 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="rabbitmq" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.707103 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="rabbitmq" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.707338 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" containerName="rabbitmq" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.709077 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.711200 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.712122 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.712426 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.712566 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.712751 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-spc9v" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.716130 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.716185 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.721233 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.824827 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.825425 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.825549 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.825635 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.827028 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.827306 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.828139 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrnm\" (UniqueName: 
\"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-kube-api-access-bgrnm\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.828278 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.828494 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.828613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.828738 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930408 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930480 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930500 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930524 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930593 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930610 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 
18:05:43.930625 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930641 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930677 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930716 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.930750 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrnm\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-kube-api-access-bgrnm\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.931003 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.932060 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.932855 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.933510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.933807 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.934143 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " 
pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.938162 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.938658 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.944344 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.948493 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.950806 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:43 crc kubenswrapper[4797]: I0930 18:05:43.952683 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrnm\" (UniqueName: \"kubernetes.io/projected/c7ffa7d5-6c5e-4d12-beb4-beca118f83d5-kube-api-access-bgrnm\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.032973 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033129 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033173 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033217 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033270 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033391 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033510 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033653 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49krn\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033794 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.033837 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc 
kubenswrapper[4797]: I0930 18:05:44.033881 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret\") pod \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\" (UID: \"6a69c5e9-777c-48ad-8af7-78e770d2a9b2\") " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.040641 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info" (OuterVolumeSpecName: "pod-info") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.041539 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.041720 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.043293 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.044676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn" (OuterVolumeSpecName: "kube-api-access-49krn") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "kube-api-access-49krn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.053319 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.054713 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5\") " pod="openstack/rabbitmq-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.065280 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.072323 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.099655 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data" (OuterVolumeSpecName: "config-data") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137056 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137088 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137102 4797 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137112 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137142 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137152 4797 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137160 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137168 4797 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.137179 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49krn\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-kube-api-access-49krn\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.173756 4797 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.207686 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf" (OuterVolumeSpecName: "server-conf") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.240722 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.240755 4797 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.273039 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65610b42-1ed9-4a27-996a-09e0ebd560e5" path="/var/lib/kubelet/pods/65610b42-1ed9-4a27-996a-09e0ebd560e5/volumes" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.286814 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6a69c5e9-777c-48ad-8af7-78e770d2a9b2" (UID: "6a69c5e9-777c-48ad-8af7-78e770d2a9b2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.344849 4797 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a69c5e9-777c-48ad-8af7-78e770d2a9b2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.354996 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.588640 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a69c5e9-777c-48ad-8af7-78e770d2a9b2","Type":"ContainerDied","Data":"6ab52c034f0b6d1519cfa788b0a1bb05d7062840b05df53c1b5a2a93fe212dea"} Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.588918 4797 scope.go:117] "RemoveContainer" containerID="4c7e640eaf85c51f57b6cf5bb940418082a77d3851b444a2154f6865b2d5de23" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.588704 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.618397 4797 scope.go:117] "RemoveContainer" containerID="ddea9aed82f65ba1df21799ed2ffbae0368261d2ae6cbde18121dc443bad437c" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.627390 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.638498 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.673310 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 18:05:44 crc kubenswrapper[4797]: E0930 18:05:44.674103 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="setup-container" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.674121 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="setup-container" Sep 30 18:05:44 crc kubenswrapper[4797]: E0930 18:05:44.674142 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="rabbitmq" Sep 30 18:05:44 crc 
kubenswrapper[4797]: I0930 18:05:44.674149 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="rabbitmq" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.674375 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" containerName="rabbitmq" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.675465 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.678841 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.678993 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.679049 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.679242 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.679286 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.679466 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.679799 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lx7d9" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.695251 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.759849 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.759910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.759975 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.759994 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.760013 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 
18:05:44.760035 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.760051 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.760076 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.760092 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg4r\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-kube-api-access-jkg4r\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.760149 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 
18:05:44.760200 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861372 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861477 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861501 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861530 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861590 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861607 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861628 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861655 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861676 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861700 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.861718 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg4r\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-kube-api-access-jkg4r\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.862135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.862652 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.862968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.863239 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.863291 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.863312 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.871752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.873474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.878655 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.878785 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.891331 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.897472 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg4r\" (UniqueName: \"kubernetes.io/projected/399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d-kube-api-access-jkg4r\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:44 crc kubenswrapper[4797]: I0930 18:05:44.932553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d\") " pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:45 crc kubenswrapper[4797]: I0930 18:05:45.019162 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 18:05:45 crc kubenswrapper[4797]: I0930 18:05:45.523711 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 18:05:45 crc kubenswrapper[4797]: I0930 18:05:45.602419 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d","Type":"ContainerStarted","Data":"55f201bbddc894ca126ba3505c293fde4ee9562b043d61dc8b69b29dd219b342"}
Sep 30 18:05:45 crc kubenswrapper[4797]: I0930 18:05:45.604116 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5","Type":"ContainerStarted","Data":"e4d87fc2751fdece573359ae8c7d445bbe48fab8c71176b239f2aecd2de85a46"}
Sep 30 18:05:45 crc kubenswrapper[4797]: I0930 18:05:45.604141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5","Type":"ContainerStarted","Data":"18095da2b9ad8a906c9d37854d63083aba749aa06b0bbbcf712dc607e1174b34"}
Sep 30 18:05:46 crc kubenswrapper[4797]: I0930 18:05:46.254961 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a69c5e9-777c-48ad-8af7-78e770d2a9b2" path="/var/lib/kubelet/pods/6a69c5e9-777c-48ad-8af7-78e770d2a9b2/volumes"
Sep 30 18:05:46 crc kubenswrapper[4797]: I0930 18:05:46.617305 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d","Type":"ContainerStarted","Data":"108839811c0813b23880f17c942262cbc69345fcaae5a324ee412d1b5949a4ee"}
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.180860 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"]
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.183803 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.193290 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.212885 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"]
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.317928 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq76\" (UniqueName: \"kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318198 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318260 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318288 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318323 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.318417 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420089 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq76\" (UniqueName: \"kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420133 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420224 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420265 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420290 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.420354 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.421368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.421484 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.421648 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.421748 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.421911 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.422335 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.451702 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq76\" (UniqueName: \"kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76\") pod \"dnsmasq-dns-67b789f86c-djf8r\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.514307 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:47 crc kubenswrapper[4797]: I0930 18:05:47.995078 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"]
Sep 30 18:05:48 crc kubenswrapper[4797]: I0930 18:05:48.649104 4797 generic.go:334] "Generic (PLEG): container finished" podID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerID="e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08" exitCode=0
Sep 30 18:05:48 crc kubenswrapper[4797]: I0930 18:05:48.649345 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" event={"ID":"d603263e-59e2-4643-9b6c-6c7a605c3226","Type":"ContainerDied","Data":"e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08"}
Sep 30 18:05:48 crc kubenswrapper[4797]: I0930 18:05:48.649406 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" event={"ID":"d603263e-59e2-4643-9b6c-6c7a605c3226","Type":"ContainerStarted","Data":"1c0926eb09193d6061b9353c10d198481ac88ced2fd2ac2d7f3d61b88c9f0fdb"}
Sep 30 18:05:49 crc kubenswrapper[4797]: I0930 18:05:49.660008 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" event={"ID":"d603263e-59e2-4643-9b6c-6c7a605c3226","Type":"ContainerStarted","Data":"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc"}
Sep 30 18:05:49 crc kubenswrapper[4797]: I0930 18:05:49.660957 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:49 crc kubenswrapper[4797]: I0930 18:05:49.695457 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" podStartSLOduration=2.695443026 podStartE2EDuration="2.695443026s" podCreationTimestamp="2025-09-30 18:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:05:49.693911185 +0000 UTC m=+1400.216410423" watchObservedRunningTime="2025-09-30 18:05:49.695443026 +0000 UTC m=+1400.217942264"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.234220 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"]
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.237488 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.252283 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"]
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.286760 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.286905 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.286981 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbxf\" (UniqueName: \"kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.388809 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.388911 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbxf\" (UniqueName: \"kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.389016 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.389572 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.390178 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.420173 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbxf\" (UniqueName: \"kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf\") pod \"redhat-operators-pr2kx\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.568997 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pr2kx"
Sep 30 18:05:55 crc kubenswrapper[4797]: I0930 18:05:55.895254 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"]
Sep 30 18:05:56 crc kubenswrapper[4797]: I0930 18:05:56.752233 4797 generic.go:334] "Generic (PLEG): container finished" podID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerID="bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878" exitCode=0
Sep 30 18:05:56 crc kubenswrapper[4797]: I0930 18:05:56.752309 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerDied","Data":"bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878"}
Sep 30 18:05:56 crc kubenswrapper[4797]: I0930 18:05:56.752627 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerStarted","Data":"8c9021667bf95fbba75b9f4a3edaab411508f551b6070f62aaf4b4cc13132553"}
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.516686 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-djf8r"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.607276 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"]
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.607680 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="dnsmasq-dns" containerID="cri-o://e60736780c3b04991c1feb25978907608a50de80fac89989504fa0954d452009" gracePeriod=10
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.781162 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-sc66n"]
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.782486 4797 generic.go:334] "Generic (PLEG): container finished" podID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerID="e60736780c3b04991c1feb25978907608a50de80fac89989504fa0954d452009" exitCode=0
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.783180 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" event={"ID":"1494eb69-c2b0-489c-a348-dfa4047c09db","Type":"ContainerDied","Data":"e60736780c3b04991c1feb25978907608a50de80fac89989504fa0954d452009"}
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.783299 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.805264 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-sc66n"]
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.889592 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-config\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.891173 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.891285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.891576 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.891990 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.892025 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgdr\" (UniqueName: \"kubernetes.io/projected/28db4edb-04c5-44de-917b-8578fa6c4031-kube-api-access-kbgdr\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.892099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.993848 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.995346 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbgdr\" (UniqueName: \"kubernetes.io/projected/28db4edb-04c5-44de-917b-8578fa6c4031-kube-api-access-kbgdr\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.995294 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.995463 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.995510 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-config\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.996540 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.996584 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-config\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.996897 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.997070 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.997083 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.997199 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.997692 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:57 crc kubenswrapper[4797]: I0930 18:05:57.998279 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28db4edb-04c5-44de-917b-8578fa6c4031-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.017535 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgdr\" (UniqueName: \"kubernetes.io/projected/28db4edb-04c5-44de-917b-8578fa6c4031-kube-api-access-kbgdr\") pod \"dnsmasq-dns-7fd9f947b7-sc66n\" (UID: \"28db4edb-04c5-44de-917b-8578fa6c4031\") " pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.173110 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n"
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.194626 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r"
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301155 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjd99\" (UniqueName: \"kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") "
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301408 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") "
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301539 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") "
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301575 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") "
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301673 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") "
Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.301908 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb\") pod \"1494eb69-c2b0-489c-a348-dfa4047c09db\" (UID: \"1494eb69-c2b0-489c-a348-dfa4047c09db\") " Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.307036 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99" (OuterVolumeSpecName: "kube-api-access-mjd99") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "kube-api-access-mjd99". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.363487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.374902 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config" (OuterVolumeSpecName: "config") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.380027 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.404039 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.404079 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.404089 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.404101 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjd99\" (UniqueName: \"kubernetes.io/projected/1494eb69-c2b0-489c-a348-dfa4047c09db-kube-api-access-mjd99\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.406814 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.417134 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1494eb69-c2b0-489c-a348-dfa4047c09db" (UID: "1494eb69-c2b0-489c-a348-dfa4047c09db"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.506495 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.506845 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1494eb69-c2b0-489c-a348-dfa4047c09db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.654689 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-sc66n"] Sep 30 18:05:58 crc kubenswrapper[4797]: W0930 18:05:58.662309 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28db4edb_04c5_44de_917b_8578fa6c4031.slice/crio-ac41db4eb90bf4fd99cfdfedbebf3c1af34035ee2af7e724e664659e43c652e7 WatchSource:0}: Error finding container ac41db4eb90bf4fd99cfdfedbebf3c1af34035ee2af7e724e664659e43c652e7: Status 404 returned error can't find the container with id ac41db4eb90bf4fd99cfdfedbebf3c1af34035ee2af7e724e664659e43c652e7 Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.799561 4797 generic.go:334] "Generic (PLEG): container finished" podID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerID="d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251" exitCode=0 Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.799641 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerDied","Data":"d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251"} Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.809248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" event={"ID":"1494eb69-c2b0-489c-a348-dfa4047c09db","Type":"ContainerDied","Data":"4b235b6f5c96ea7558a10d76aabf48b21899ee5eb4dca4cff686fd25a6c6ae34"} Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.809308 4797 scope.go:117] "RemoveContainer" containerID="e60736780c3b04991c1feb25978907608a50de80fac89989504fa0954d452009" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.809497 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-vqf8r" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.825682 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" event={"ID":"28db4edb-04c5-44de-917b-8578fa6c4031","Type":"ContainerStarted","Data":"ac41db4eb90bf4fd99cfdfedbebf3c1af34035ee2af7e724e664659e43c652e7"} Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.880087 4797 scope.go:117] "RemoveContainer" containerID="fcb3b81368cfdd97e74b410c964dbfbdbb87d0936a08eeb9ff34372e922835bf" Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.910063 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"] Sep 30 18:05:58 crc kubenswrapper[4797]: I0930 18:05:58.920291 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-vqf8r"] Sep 30 18:05:59 crc kubenswrapper[4797]: I0930 18:05:59.840125 4797 generic.go:334] "Generic (PLEG): container finished" podID="28db4edb-04c5-44de-917b-8578fa6c4031" containerID="6d51a194ff52cf3195964e9129050c9919511f640335a1d7ba313ca94043213c" exitCode=0 Sep 30 18:05:59 crc kubenswrapper[4797]: I0930 18:05:59.840465 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" event={"ID":"28db4edb-04c5-44de-917b-8578fa6c4031","Type":"ContainerDied","Data":"6d51a194ff52cf3195964e9129050c9919511f640335a1d7ba313ca94043213c"} Sep 30 18:05:59 crc kubenswrapper[4797]: I0930 
18:05:59.845761 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerStarted","Data":"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a"} Sep 30 18:05:59 crc kubenswrapper[4797]: I0930 18:05:59.907704 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pr2kx" podStartSLOduration=2.27899495 podStartE2EDuration="4.90768553s" podCreationTimestamp="2025-09-30 18:05:55 +0000 UTC" firstStartedPulling="2025-09-30 18:05:56.754534315 +0000 UTC m=+1407.277033563" lastFinishedPulling="2025-09-30 18:05:59.383224905 +0000 UTC m=+1409.905724143" observedRunningTime="2025-09-30 18:05:59.884166538 +0000 UTC m=+1410.406665796" watchObservedRunningTime="2025-09-30 18:05:59.90768553 +0000 UTC m=+1410.430184768" Sep 30 18:06:00 crc kubenswrapper[4797]: I0930 18:06:00.252726 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" path="/var/lib/kubelet/pods/1494eb69-c2b0-489c-a348-dfa4047c09db/volumes" Sep 30 18:06:00 crc kubenswrapper[4797]: I0930 18:06:00.860426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" event={"ID":"28db4edb-04c5-44de-917b-8578fa6c4031","Type":"ContainerStarted","Data":"699b2e8dcc5040e53bab20961f805329a8ae8d02e08bff9bd845ef8417e3a69d"} Sep 30 18:06:00 crc kubenswrapper[4797]: I0930 18:06:00.893752 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" podStartSLOduration=3.8937257240000003 podStartE2EDuration="3.893725724s" podCreationTimestamp="2025-09-30 18:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:06:00.883722662 +0000 UTC m=+1411.406221910" watchObservedRunningTime="2025-09-30 
18:06:00.893725724 +0000 UTC m=+1411.416224992" Sep 30 18:06:01 crc kubenswrapper[4797]: I0930 18:06:01.871167 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" Sep 30 18:06:05 crc kubenswrapper[4797]: I0930 18:06:05.569163 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:05 crc kubenswrapper[4797]: I0930 18:06:05.569732 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:05 crc kubenswrapper[4797]: I0930 18:06:05.647982 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:05 crc kubenswrapper[4797]: I0930 18:06:05.987004 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:06 crc kubenswrapper[4797]: I0930 18:06:06.047666 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"] Sep 30 18:06:07 crc kubenswrapper[4797]: I0930 18:06:07.933062 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pr2kx" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="registry-server" containerID="cri-o://7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a" gracePeriod=2 Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.174628 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd9f947b7-sc66n" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.267081 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"] Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.267319 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-67b789f86c-djf8r" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="dnsmasq-dns" containerID="cri-o://517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc" gracePeriod=10 Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.489494 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.520566 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpbxf\" (UniqueName: \"kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf\") pod \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.520695 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities\") pod \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.520726 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content\") pod \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\" (UID: \"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.526864 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities" (OuterVolumeSpecName: "utilities") pod "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" (UID: "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.539952 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf" (OuterVolumeSpecName: "kube-api-access-mpbxf") pod "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" (UID: "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83"). InnerVolumeSpecName "kube-api-access-mpbxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.623931 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpbxf\" (UniqueName: \"kubernetes.io/projected/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-kube-api-access-mpbxf\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.623978 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.645751 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" (UID: "206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.727447 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.837685 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930375 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930516 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930610 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930666 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930713 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq76\" 
(UniqueName: \"kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.930868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam\") pod \"d603263e-59e2-4643-9b6c-6c7a605c3226\" (UID: \"d603263e-59e2-4643-9b6c-6c7a605c3226\") " Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.935891 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76" (OuterVolumeSpecName: "kube-api-access-8sq76") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "kube-api-access-8sq76". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.948856 4797 generic.go:334] "Generic (PLEG): container finished" podID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerID="517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc" exitCode=0 Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.949038 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.950376 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" event={"ID":"d603263e-59e2-4643-9b6c-6c7a605c3226","Type":"ContainerDied","Data":"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc"} Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.951309 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-djf8r" event={"ID":"d603263e-59e2-4643-9b6c-6c7a605c3226","Type":"ContainerDied","Data":"1c0926eb09193d6061b9353c10d198481ac88ced2fd2ac2d7f3d61b88c9f0fdb"} Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.951345 4797 scope.go:117] "RemoveContainer" containerID="517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.955564 4797 generic.go:334] "Generic (PLEG): container finished" podID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerID="7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a" exitCode=0 Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.955634 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerDied","Data":"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a"} Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.955667 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pr2kx" event={"ID":"206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83","Type":"ContainerDied","Data":"8c9021667bf95fbba75b9f4a3edaab411508f551b6070f62aaf4b4cc13132553"} Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.955698 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pr2kx" Sep 30 18:06:08 crc kubenswrapper[4797]: I0930 18:06:08.978555 4797 scope.go:117] "RemoveContainer" containerID="e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.001843 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"] Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.012015 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.014909 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.016329 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.016761 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pr2kx"] Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.019465 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.020687 4797 scope.go:117] "RemoveContainer" containerID="517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc" Sep 30 18:06:09 crc kubenswrapper[4797]: E0930 18:06:09.021217 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc\": container with ID starting with 517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc not found: ID does not exist" containerID="517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.021254 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc"} err="failed to get container status \"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc\": rpc error: code = NotFound desc = could not find container \"517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc\": container with ID starting with 517b9e6f0f12bdf83a78d05a396f9abc86209b436c2b0d27be222736421c38bc not found: ID does not exist" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.021277 4797 
scope.go:117] "RemoveContainer" containerID="e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08" Sep 30 18:06:09 crc kubenswrapper[4797]: E0930 18:06:09.021736 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08\": container with ID starting with e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08 not found: ID does not exist" containerID="e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.021757 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08"} err="failed to get container status \"e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08\": rpc error: code = NotFound desc = could not find container \"e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08\": container with ID starting with e9a9360f596243bd47f122fd1d65650d180ae12bb7be83f52f00f13b08e77e08 not found: ID does not exist" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.021771 4797 scope.go:117] "RemoveContainer" containerID="7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.022411 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config" (OuterVolumeSpecName: "config") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: "d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032591 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032648 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032663 4797 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032676 4797 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032708 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq76\" (UniqueName: \"kubernetes.io/projected/d603263e-59e2-4643-9b6c-6c7a605c3226-kube-api-access-8sq76\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.032738 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.038892 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d603263e-59e2-4643-9b6c-6c7a605c3226" (UID: 
"d603263e-59e2-4643-9b6c-6c7a605c3226"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.046681 4797 scope.go:117] "RemoveContainer" containerID="d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.076395 4797 scope.go:117] "RemoveContainer" containerID="bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.137978 4797 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d603263e-59e2-4643-9b6c-6c7a605c3226-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.150956 4797 scope.go:117] "RemoveContainer" containerID="7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a" Sep 30 18:06:09 crc kubenswrapper[4797]: E0930 18:06:09.151410 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a\": container with ID starting with 7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a not found: ID does not exist" containerID="7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.151522 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a"} err="failed to get container status \"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a\": rpc error: code = NotFound desc = could not find container \"7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a\": container with ID starting with 7d22e4f88810ab88500d449a5c179d952e24cd001e4a84a9766b58e241433c4a not found: ID 
does not exist" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.151551 4797 scope.go:117] "RemoveContainer" containerID="d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251" Sep 30 18:06:09 crc kubenswrapper[4797]: E0930 18:06:09.151838 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251\": container with ID starting with d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251 not found: ID does not exist" containerID="d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.151868 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251"} err="failed to get container status \"d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251\": rpc error: code = NotFound desc = could not find container \"d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251\": container with ID starting with d3d5e562bf69325a9dc44a6146c975a1a4e6bfd55486c0e7b43cbf8b49d7b251 not found: ID does not exist" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.151890 4797 scope.go:117] "RemoveContainer" containerID="bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878" Sep 30 18:06:09 crc kubenswrapper[4797]: E0930 18:06:09.152118 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878\": container with ID starting with bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878 not found: ID does not exist" containerID="bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.152146 4797 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878"} err="failed to get container status \"bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878\": rpc error: code = NotFound desc = could not find container \"bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878\": container with ID starting with bf83a2424a1ebe5be41934674a8005c663de7c723e07d381940a5a4d50225878 not found: ID does not exist" Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.289702 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"] Sep 30 18:06:09 crc kubenswrapper[4797]: I0930 18:06:09.299174 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-djf8r"] Sep 30 18:06:10 crc kubenswrapper[4797]: I0930 18:06:10.254380 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" path="/var/lib/kubelet/pods/206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83/volumes" Sep 30 18:06:10 crc kubenswrapper[4797]: I0930 18:06:10.256816 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" path="/var/lib/kubelet/pods/d603263e-59e2-4643-9b6c-6c7a605c3226/volumes" Sep 30 18:06:16 crc kubenswrapper[4797]: I0930 18:06:16.055999 4797 generic.go:334] "Generic (PLEG): container finished" podID="c7ffa7d5-6c5e-4d12-beb4-beca118f83d5" containerID="e4d87fc2751fdece573359ae8c7d445bbe48fab8c71176b239f2aecd2de85a46" exitCode=0 Sep 30 18:06:16 crc kubenswrapper[4797]: I0930 18:06:16.056129 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5","Type":"ContainerDied","Data":"e4d87fc2751fdece573359ae8c7d445bbe48fab8c71176b239f2aecd2de85a46"} Sep 30 18:06:17 crc kubenswrapper[4797]: I0930 18:06:17.070777 4797 generic.go:334] "Generic (PLEG): container 
finished" podID="399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d" containerID="108839811c0813b23880f17c942262cbc69345fcaae5a324ee412d1b5949a4ee" exitCode=0 Sep 30 18:06:17 crc kubenswrapper[4797]: I0930 18:06:17.070902 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d","Type":"ContainerDied","Data":"108839811c0813b23880f17c942262cbc69345fcaae5a324ee412d1b5949a4ee"} Sep 30 18:06:17 crc kubenswrapper[4797]: I0930 18:06:17.074637 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7ffa7d5-6c5e-4d12-beb4-beca118f83d5","Type":"ContainerStarted","Data":"cda8ff0603e9912d4f5eb1351a92bfc77dd36004ad59e3edc64ec707cce1ce39"} Sep 30 18:06:17 crc kubenswrapper[4797]: I0930 18:06:17.074991 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 18:06:17 crc kubenswrapper[4797]: I0930 18:06:17.164121 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.164095876 podStartE2EDuration="34.164095876s" podCreationTimestamp="2025-09-30 18:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:06:17.1587208 +0000 UTC m=+1427.681220038" watchObservedRunningTime="2025-09-30 18:06:17.164095876 +0000 UTC m=+1427.686595114" Sep 30 18:06:18 crc kubenswrapper[4797]: I0930 18:06:18.089472 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d","Type":"ContainerStarted","Data":"6dd5ded01b90206354e9624522f1dad7be64dc090887016b833d617305469381"} Sep 30 18:06:18 crc kubenswrapper[4797]: I0930 18:06:18.119132 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.119118023 
podStartE2EDuration="34.119118023s" podCreationTimestamp="2025-09-30 18:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:06:18.116085691 +0000 UTC m=+1428.638584949" watchObservedRunningTime="2025-09-30 18:06:18.119118023 +0000 UTC m=+1428.641617261" Sep 30 18:06:25 crc kubenswrapper[4797]: I0930 18:06:25.020372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.017988 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh"] Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018715 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018728 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018740 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="extract-utilities" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018746 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="extract-utilities" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018761 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="extract-content" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018767 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="extract-content" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018777 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="registry-server" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018783 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="registry-server" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018804 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="init" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018809 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="init" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018822 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="init" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018827 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="init" Sep 30 18:06:27 crc kubenswrapper[4797]: E0930 18:06:27.018851 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.018856 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.019031 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1494eb69-c2b0-489c-a348-dfa4047c09db" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.019046 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="206c3fb2-ce8e-4c45-b8c8-e1a8f0bd5e83" containerName="registry-server" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.019070 4797 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d603263e-59e2-4643-9b6c-6c7a605c3226" containerName="dnsmasq-dns" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.019707 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.022985 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.023016 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.023394 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.041954 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.049715 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh"] Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.137988 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.138036 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqnst\" (UniqueName: \"kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: 
\"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.138411 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.138559 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.240587 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.240733 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.240903 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.240954 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqnst\" (UniqueName: \"kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.248720 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.248934 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.253497 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: 
\"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.265628 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqnst\" (UniqueName: \"kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.350686 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:27 crc kubenswrapper[4797]: I0930 18:06:27.935225 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh"] Sep 30 18:06:27 crc kubenswrapper[4797]: W0930 18:06:27.939132 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbff66612_d0f8_4159_a096_478975f4d2e5.slice/crio-21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7 WatchSource:0}: Error finding container 21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7: Status 404 returned error can't find the container with id 21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7 Sep 30 18:06:28 crc kubenswrapper[4797]: I0930 18:06:28.202182 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" event={"ID":"bff66612-d0f8-4159-a096-478975f4d2e5","Type":"ContainerStarted","Data":"21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7"} Sep 30 18:06:34 crc kubenswrapper[4797]: I0930 18:06:34.359668 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Sep 30 18:06:35 crc kubenswrapper[4797]: I0930 18:06:35.022640 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 18:06:37 crc kubenswrapper[4797]: I0930 18:06:37.406375 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:06:38 crc kubenswrapper[4797]: I0930 18:06:38.354268 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" event={"ID":"bff66612-d0f8-4159-a096-478975f4d2e5","Type":"ContainerStarted","Data":"96f5158676e402ed0f897a8bd8696c4de2cfee2e15e10819ac05b6e4b3c25a2f"} Sep 30 18:06:38 crc kubenswrapper[4797]: I0930 18:06:38.389408 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" podStartSLOduration=2.930113942 podStartE2EDuration="12.389382903s" podCreationTimestamp="2025-09-30 18:06:26 +0000 UTC" firstStartedPulling="2025-09-30 18:06:27.941484517 +0000 UTC m=+1438.463983755" lastFinishedPulling="2025-09-30 18:06:37.400753468 +0000 UTC m=+1447.923252716" observedRunningTime="2025-09-30 18:06:38.374931889 +0000 UTC m=+1448.897431137" watchObservedRunningTime="2025-09-30 18:06:38.389382903 +0000 UTC m=+1448.911882181" Sep 30 18:06:49 crc kubenswrapper[4797]: I0930 18:06:49.507164 4797 generic.go:334] "Generic (PLEG): container finished" podID="bff66612-d0f8-4159-a096-478975f4d2e5" containerID="96f5158676e402ed0f897a8bd8696c4de2cfee2e15e10819ac05b6e4b3c25a2f" exitCode=0 Sep 30 18:06:49 crc kubenswrapper[4797]: I0930 18:06:49.507257 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" event={"ID":"bff66612-d0f8-4159-a096-478975f4d2e5","Type":"ContainerDied","Data":"96f5158676e402ed0f897a8bd8696c4de2cfee2e15e10819ac05b6e4b3c25a2f"} Sep 30 18:06:51 crc 
kubenswrapper[4797]: I0930 18:06:51.047926 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.115134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory\") pod \"bff66612-d0f8-4159-a096-478975f4d2e5\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.115314 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle\") pod \"bff66612-d0f8-4159-a096-478975f4d2e5\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.116311 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key\") pod \"bff66612-d0f8-4159-a096-478975f4d2e5\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.116411 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqnst\" (UniqueName: \"kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst\") pod \"bff66612-d0f8-4159-a096-478975f4d2e5\" (UID: \"bff66612-d0f8-4159-a096-478975f4d2e5\") " Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.122934 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst" (OuterVolumeSpecName: "kube-api-access-lqnst") pod "bff66612-d0f8-4159-a096-478975f4d2e5" (UID: "bff66612-d0f8-4159-a096-478975f4d2e5"). 
InnerVolumeSpecName "kube-api-access-lqnst". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.126916 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bff66612-d0f8-4159-a096-478975f4d2e5" (UID: "bff66612-d0f8-4159-a096-478975f4d2e5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.149394 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bff66612-d0f8-4159-a096-478975f4d2e5" (UID: "bff66612-d0f8-4159-a096-478975f4d2e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.159019 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory" (OuterVolumeSpecName: "inventory") pod "bff66612-d0f8-4159-a096-478975f4d2e5" (UID: "bff66612-d0f8-4159-a096-478975f4d2e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.218935 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqnst\" (UniqueName: \"kubernetes.io/projected/bff66612-d0f8-4159-a096-478975f4d2e5-kube-api-access-lqnst\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.218982 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.218997 4797 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.219011 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bff66612-d0f8-4159-a096-478975f4d2e5-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.538484 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" event={"ID":"bff66612-d0f8-4159-a096-478975f4d2e5","Type":"ContainerDied","Data":"21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7"} Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.538548 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e71af06704782559f4a2946b75496a60e76d0613e94ec0059f8eec069819e7" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.538560 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.665178 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2"] Sep 30 18:06:51 crc kubenswrapper[4797]: E0930 18:06:51.666078 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff66612-d0f8-4159-a096-478975f4d2e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.666132 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff66612-d0f8-4159-a096-478975f4d2e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.666689 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff66612-d0f8-4159-a096-478975f4d2e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.668412 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.674065 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.674248 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.674455 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.681950 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.687131 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2"] Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.732887 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.733281 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5dl\" (UniqueName: \"kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.733324 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.834942 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.835048 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5dl\" (UniqueName: \"kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.835098 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.842700 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.846996 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:51 crc kubenswrapper[4797]: I0930 18:06:51.868981 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5dl\" (UniqueName: \"kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-j5hr2\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:52 crc kubenswrapper[4797]: I0930 18:06:52.007829 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:52 crc kubenswrapper[4797]: I0930 18:06:52.660962 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2"] Sep 30 18:06:53 crc kubenswrapper[4797]: I0930 18:06:53.559941 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" event={"ID":"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f","Type":"ContainerStarted","Data":"16d9a3133f0b67b1276c0f75a1e642f92097cdb622818c62c27eb9d326696e45"} Sep 30 18:06:53 crc kubenswrapper[4797]: I0930 18:06:53.560693 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" event={"ID":"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f","Type":"ContainerStarted","Data":"0588fde9e27ea43a5d65d0e7020f7ea4ac6bcda1fa3e010696c4206ad102ca75"} Sep 30 18:06:53 crc kubenswrapper[4797]: I0930 18:06:53.605085 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" podStartSLOduration=2.155102445 podStartE2EDuration="2.605062727s" podCreationTimestamp="2025-09-30 18:06:51 +0000 UTC" firstStartedPulling="2025-09-30 18:06:52.663034544 +0000 UTC m=+1463.185533782" lastFinishedPulling="2025-09-30 18:06:53.112994806 +0000 UTC m=+1463.635494064" observedRunningTime="2025-09-30 18:06:53.592802102 +0000 UTC m=+1464.115301340" watchObservedRunningTime="2025-09-30 18:06:53.605062727 +0000 UTC m=+1464.127561975" Sep 30 18:06:56 crc kubenswrapper[4797]: I0930 18:06:56.592234 4797 generic.go:334] "Generic (PLEG): container finished" podID="d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" containerID="16d9a3133f0b67b1276c0f75a1e642f92097cdb622818c62c27eb9d326696e45" exitCode=0 Sep 30 18:06:56 crc kubenswrapper[4797]: I0930 18:06:56.592314 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" event={"ID":"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f","Type":"ContainerDied","Data":"16d9a3133f0b67b1276c0f75a1e642f92097cdb622818c62c27eb9d326696e45"} Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.083703 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.125611 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key\") pod \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.125842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj5dl\" (UniqueName: \"kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl\") pod \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.125942 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory\") pod \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\" (UID: \"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f\") " Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.131721 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl" (OuterVolumeSpecName: "kube-api-access-fj5dl") pod "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" (UID: "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f"). InnerVolumeSpecName "kube-api-access-fj5dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.161097 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" (UID: "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.161757 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory" (OuterVolumeSpecName: "inventory") pod "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" (UID: "d8595051-9106-4b9a-bc5a-0a3e2e6ad11f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.229739 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj5dl\" (UniqueName: \"kubernetes.io/projected/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-kube-api-access-fj5dl\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.229793 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.229811 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8595051-9106-4b9a-bc5a-0a3e2e6ad11f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.614851 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" 
event={"ID":"d8595051-9106-4b9a-bc5a-0a3e2e6ad11f","Type":"ContainerDied","Data":"0588fde9e27ea43a5d65d0e7020f7ea4ac6bcda1fa3e010696c4206ad102ca75"} Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.615151 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-j5hr2" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.615167 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0588fde9e27ea43a5d65d0e7020f7ea4ac6bcda1fa3e010696c4206ad102ca75" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.717604 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg"] Sep 30 18:06:58 crc kubenswrapper[4797]: E0930 18:06:58.718111 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.718132 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.718423 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8595051-9106-4b9a-bc5a-0a3e2e6ad11f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.719301 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.728380 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.728518 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.728714 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.728872 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.741102 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg"] Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.747915 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.747977 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.748002 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl72h\" (UniqueName: \"kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.748046 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.850035 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.850120 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.850161 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl72h\" (UniqueName: \"kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.850225 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.856509 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.856919 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.857110 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:58 crc kubenswrapper[4797]: I0930 18:06:58.875968 4797 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xl72h\" (UniqueName: \"kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:59 crc kubenswrapper[4797]: I0930 18:06:59.055826 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:06:59 crc kubenswrapper[4797]: I0930 18:06:59.688941 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg"] Sep 30 18:07:00 crc kubenswrapper[4797]: I0930 18:07:00.636876 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" event={"ID":"76ed1105-fad0-4d4d-9039-06795b66a457","Type":"ContainerStarted","Data":"8366b6b19892c178d71bef69f08436ea4a14f6d83aa1441b94be42ec99edf288"} Sep 30 18:07:01 crc kubenswrapper[4797]: I0930 18:07:01.648988 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" event={"ID":"76ed1105-fad0-4d4d-9039-06795b66a457","Type":"ContainerStarted","Data":"4bf3f55b36f4100c88b2bcfda147b4508ad152476d95ea2c4204d38ac37945d5"} Sep 30 18:07:01 crc kubenswrapper[4797]: I0930 18:07:01.676780 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" podStartSLOduration=2.983932205 podStartE2EDuration="3.676753575s" podCreationTimestamp="2025-09-30 18:06:58 +0000 UTC" firstStartedPulling="2025-09-30 18:06:59.715996406 +0000 UTC m=+1470.238495674" lastFinishedPulling="2025-09-30 18:07:00.408817806 +0000 UTC m=+1470.931317044" observedRunningTime="2025-09-30 18:07:01.668735116 +0000 UTC m=+1472.191234394" watchObservedRunningTime="2025-09-30 
18:07:01.676753575 +0000 UTC m=+1472.199252853" Sep 30 18:07:13 crc kubenswrapper[4797]: I0930 18:07:13.377381 4797 scope.go:117] "RemoveContainer" containerID="1af04ae03e62803cc27062452272d9a85a40afa0b07cd13b3ea375ab204604d2" Sep 30 18:07:14 crc kubenswrapper[4797]: I0930 18:07:14.193281 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:07:14 crc kubenswrapper[4797]: I0930 18:07:14.193359 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:07:44 crc kubenswrapper[4797]: I0930 18:07:44.191545 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:07:44 crc kubenswrapper[4797]: I0930 18:07:44.192217 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.697499 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.702326 4797 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.719859 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.768801 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.768897 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.768981 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxb4h\" (UniqueName: \"kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.871353 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxb4h\" (UniqueName: \"kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.871837 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.872710 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.872942 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.873607 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:50 crc kubenswrapper[4797]: I0930 18:07:50.905269 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxb4h\" (UniqueName: \"kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h\") pod \"redhat-marketplace-zprpz\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:51 crc kubenswrapper[4797]: I0930 18:07:51.054809 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:07:51 crc kubenswrapper[4797]: I0930 18:07:51.511160 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:07:52 crc kubenswrapper[4797]: I0930 18:07:52.253218 4797 generic.go:334] "Generic (PLEG): container finished" podID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerID="8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b" exitCode=0 Sep 30 18:07:52 crc kubenswrapper[4797]: I0930 18:07:52.256217 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerDied","Data":"8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b"} Sep 30 18:07:52 crc kubenswrapper[4797]: I0930 18:07:52.256249 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerStarted","Data":"6b8a77e08fac3587f5077f51ce8a41746261e9e26fb69eb37f33a23641ab8c46"} Sep 30 18:07:54 crc kubenswrapper[4797]: I0930 18:07:54.291854 4797 generic.go:334] "Generic (PLEG): container finished" podID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerID="babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83" exitCode=0 Sep 30 18:07:54 crc kubenswrapper[4797]: I0930 18:07:54.291943 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerDied","Data":"babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83"} Sep 30 18:07:55 crc kubenswrapper[4797]: I0930 18:07:55.309075 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" 
event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerStarted","Data":"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5"} Sep 30 18:07:55 crc kubenswrapper[4797]: I0930 18:07:55.340585 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zprpz" podStartSLOduration=2.584620831 podStartE2EDuration="5.340565416s" podCreationTimestamp="2025-09-30 18:07:50 +0000 UTC" firstStartedPulling="2025-09-30 18:07:52.256530649 +0000 UTC m=+1522.779029897" lastFinishedPulling="2025-09-30 18:07:55.012475244 +0000 UTC m=+1525.534974482" observedRunningTime="2025-09-30 18:07:55.340082973 +0000 UTC m=+1525.862582241" watchObservedRunningTime="2025-09-30 18:07:55.340565416 +0000 UTC m=+1525.863064664" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.057268 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.059574 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.077292 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.219304 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7ww\" (UniqueName: \"kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.219714 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.219819 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.321631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.321809 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-px7ww\" (UniqueName: \"kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.321829 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.322261 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.322523 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.374588 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7ww\" (UniqueName: \"kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww\") pod \"community-operators-j9rlx\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.422335 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:07:57 crc kubenswrapper[4797]: W0930 18:07:57.895909 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9485aab4_9c83_46e2_a815_e85c98ac9d39.slice/crio-56629f5967f17f9110bf1aa2527a16d364b232666ac9400bad0c5e6b77679460 WatchSource:0}: Error finding container 56629f5967f17f9110bf1aa2527a16d364b232666ac9400bad0c5e6b77679460: Status 404 returned error can't find the container with id 56629f5967f17f9110bf1aa2527a16d364b232666ac9400bad0c5e6b77679460 Sep 30 18:07:57 crc kubenswrapper[4797]: I0930 18:07:57.898318 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:07:58 crc kubenswrapper[4797]: I0930 18:07:58.375712 4797 generic.go:334] "Generic (PLEG): container finished" podID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerID="4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e" exitCode=0 Sep 30 18:07:58 crc kubenswrapper[4797]: I0930 18:07:58.375766 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerDied","Data":"4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e"} Sep 30 18:07:58 crc kubenswrapper[4797]: I0930 18:07:58.375821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerStarted","Data":"56629f5967f17f9110bf1aa2527a16d364b232666ac9400bad0c5e6b77679460"} Sep 30 18:07:59 crc kubenswrapper[4797]: I0930 18:07:59.390825 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" 
event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerStarted","Data":"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765"} Sep 30 18:08:00 crc kubenswrapper[4797]: I0930 18:08:00.420477 4797 generic.go:334] "Generic (PLEG): container finished" podID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerID="63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765" exitCode=0 Sep 30 18:08:00 crc kubenswrapper[4797]: I0930 18:08:00.420560 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerDied","Data":"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765"} Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.056947 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.059723 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.127033 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.438796 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerStarted","Data":"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218"} Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.461789 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j9rlx" podStartSLOduration=1.997095941 podStartE2EDuration="4.461773928s" podCreationTimestamp="2025-09-30 18:07:57 +0000 UTC" firstStartedPulling="2025-09-30 18:07:58.377163577 +0000 UTC 
m=+1528.899662815" lastFinishedPulling="2025-09-30 18:08:00.841841564 +0000 UTC m=+1531.364340802" observedRunningTime="2025-09-30 18:08:01.4592887 +0000 UTC m=+1531.981787948" watchObservedRunningTime="2025-09-30 18:08:01.461773928 +0000 UTC m=+1531.984273166" Sep 30 18:08:01 crc kubenswrapper[4797]: I0930 18:08:01.489907 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:03 crc kubenswrapper[4797]: I0930 18:08:03.442476 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:08:04 crc kubenswrapper[4797]: I0930 18:08:04.467968 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zprpz" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="registry-server" containerID="cri-o://eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5" gracePeriod=2 Sep 30 18:08:04 crc kubenswrapper[4797]: I0930 18:08:04.970357 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.084333 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxb4h\" (UniqueName: \"kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h\") pod \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.084506 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities\") pod \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.084599 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content\") pod \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\" (UID: \"ccbf9e96-fc42-444d-b66b-db58c50ef2db\") " Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.086374 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities" (OuterVolumeSpecName: "utilities") pod "ccbf9e96-fc42-444d-b66b-db58c50ef2db" (UID: "ccbf9e96-fc42-444d-b66b-db58c50ef2db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.090674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h" (OuterVolumeSpecName: "kube-api-access-wxb4h") pod "ccbf9e96-fc42-444d-b66b-db58c50ef2db" (UID: "ccbf9e96-fc42-444d-b66b-db58c50ef2db"). InnerVolumeSpecName "kube-api-access-wxb4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.098054 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccbf9e96-fc42-444d-b66b-db58c50ef2db" (UID: "ccbf9e96-fc42-444d-b66b-db58c50ef2db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.187325 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxb4h\" (UniqueName: \"kubernetes.io/projected/ccbf9e96-fc42-444d-b66b-db58c50ef2db-kube-api-access-wxb4h\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.187368 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.187381 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbf9e96-fc42-444d-b66b-db58c50ef2db-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.498948 4797 generic.go:334] "Generic (PLEG): container finished" podID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerID="eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5" exitCode=0 Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.498988 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zprpz" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.499012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerDied","Data":"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5"} Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.499327 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zprpz" event={"ID":"ccbf9e96-fc42-444d-b66b-db58c50ef2db","Type":"ContainerDied","Data":"6b8a77e08fac3587f5077f51ce8a41746261e9e26fb69eb37f33a23641ab8c46"} Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.499384 4797 scope.go:117] "RemoveContainer" containerID="eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.541134 4797 scope.go:117] "RemoveContainer" containerID="babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.590703 4797 scope.go:117] "RemoveContainer" containerID="8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.591028 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.601781 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zprpz"] Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.665108 4797 scope.go:117] "RemoveContainer" containerID="eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5" Sep 30 18:08:05 crc kubenswrapper[4797]: E0930 18:08:05.666236 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5\": container with ID starting with eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5 not found: ID does not exist" containerID="eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.666302 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5"} err="failed to get container status \"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5\": rpc error: code = NotFound desc = could not find container \"eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5\": container with ID starting with eb7946e3bd1d68dd7e688fd8e537684a4762d77dfb50d487b71cd282169e93f5 not found: ID does not exist" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.666340 4797 scope.go:117] "RemoveContainer" containerID="babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83" Sep 30 18:08:05 crc kubenswrapper[4797]: E0930 18:08:05.667468 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83\": container with ID starting with babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83 not found: ID does not exist" containerID="babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.667497 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83"} err="failed to get container status \"babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83\": rpc error: code = NotFound desc = could not find container \"babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83\": container with ID 
starting with babace20c161756db970ab16e6c81425a1eb153fa0593fc08eb06afe9ce5be83 not found: ID does not exist" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.667516 4797 scope.go:117] "RemoveContainer" containerID="8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b" Sep 30 18:08:05 crc kubenswrapper[4797]: E0930 18:08:05.668763 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b\": container with ID starting with 8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b not found: ID does not exist" containerID="8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b" Sep 30 18:08:05 crc kubenswrapper[4797]: I0930 18:08:05.668797 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b"} err="failed to get container status \"8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b\": rpc error: code = NotFound desc = could not find container \"8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b\": container with ID starting with 8248751acc6478ac3b505f3237555ee4e122cde760ad3515788fa3b1ab9dd50b not found: ID does not exist" Sep 30 18:08:06 crc kubenswrapper[4797]: I0930 18:08:06.268263 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" path="/var/lib/kubelet/pods/ccbf9e96-fc42-444d-b66b-db58c50ef2db/volumes" Sep 30 18:08:07 crc kubenswrapper[4797]: I0930 18:08:07.422715 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:07 crc kubenswrapper[4797]: I0930 18:08:07.423042 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:07 crc 
kubenswrapper[4797]: I0930 18:08:07.510497 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:07 crc kubenswrapper[4797]: I0930 18:08:07.597153 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:08 crc kubenswrapper[4797]: I0930 18:08:08.454323 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:08:09 crc kubenswrapper[4797]: I0930 18:08:09.574811 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j9rlx" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="registry-server" containerID="cri-o://55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218" gracePeriod=2 Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.156882 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.296895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities\") pod \"9485aab4-9c83-46e2-a815-e85c98ac9d39\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.297044 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content\") pod \"9485aab4-9c83-46e2-a815-e85c98ac9d39\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.297188 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7ww\" (UniqueName: \"kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww\") pod \"9485aab4-9c83-46e2-a815-e85c98ac9d39\" (UID: \"9485aab4-9c83-46e2-a815-e85c98ac9d39\") " Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.298835 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities" (OuterVolumeSpecName: "utilities") pod "9485aab4-9c83-46e2-a815-e85c98ac9d39" (UID: "9485aab4-9c83-46e2-a815-e85c98ac9d39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.306505 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww" (OuterVolumeSpecName: "kube-api-access-px7ww") pod "9485aab4-9c83-46e2-a815-e85c98ac9d39" (UID: "9485aab4-9c83-46e2-a815-e85c98ac9d39"). InnerVolumeSpecName "kube-api-access-px7ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.389335 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9485aab4-9c83-46e2-a815-e85c98ac9d39" (UID: "9485aab4-9c83-46e2-a815-e85c98ac9d39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.399892 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.399938 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9485aab4-9c83-46e2-a815-e85c98ac9d39-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.399963 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px7ww\" (UniqueName: \"kubernetes.io/projected/9485aab4-9c83-46e2-a815-e85c98ac9d39-kube-api-access-px7ww\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.590199 4797 generic.go:334] "Generic (PLEG): container finished" podID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerID="55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218" exitCode=0 Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.590267 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerDied","Data":"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218"} Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.590302 4797 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9rlx" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.590324 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9rlx" event={"ID":"9485aab4-9c83-46e2-a815-e85c98ac9d39","Type":"ContainerDied","Data":"56629f5967f17f9110bf1aa2527a16d364b232666ac9400bad0c5e6b77679460"} Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.590354 4797 scope.go:117] "RemoveContainer" containerID="55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.623144 4797 scope.go:117] "RemoveContainer" containerID="63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.653561 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.672925 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j9rlx"] Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.674487 4797 scope.go:117] "RemoveContainer" containerID="4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.726145 4797 scope.go:117] "RemoveContainer" containerID="55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218" Sep 30 18:08:10 crc kubenswrapper[4797]: E0930 18:08:10.727042 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218\": container with ID starting with 55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218 not found: ID does not exist" containerID="55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.727108 
4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218"} err="failed to get container status \"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218\": rpc error: code = NotFound desc = could not find container \"55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218\": container with ID starting with 55e576f7227e23d4d77c5f736bb0851445f546f9159158a1667277ede5281218 not found: ID does not exist" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.727151 4797 scope.go:117] "RemoveContainer" containerID="63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765" Sep 30 18:08:10 crc kubenswrapper[4797]: E0930 18:08:10.727995 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765\": container with ID starting with 63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765 not found: ID does not exist" containerID="63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.728045 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765"} err="failed to get container status \"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765\": rpc error: code = NotFound desc = could not find container \"63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765\": container with ID starting with 63bdc275c8358d44463eed60df732c933a133a3e4a63d3e61ba60b04cb434765 not found: ID does not exist" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.728078 4797 scope.go:117] "RemoveContainer" containerID="4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e" Sep 30 18:08:10 crc kubenswrapper[4797]: E0930 
18:08:10.728481 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e\": container with ID starting with 4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e not found: ID does not exist" containerID="4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e" Sep 30 18:08:10 crc kubenswrapper[4797]: I0930 18:08:10.728516 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e"} err="failed to get container status \"4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e\": rpc error: code = NotFound desc = could not find container \"4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e\": container with ID starting with 4728e2ed5940bd2af5f10d40064d9d92e73872a1ebdb62088bed7121d308434e not found: ID does not exist" Sep 30 18:08:12 crc kubenswrapper[4797]: I0930 18:08:12.254302 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" path="/var/lib/kubelet/pods/9485aab4-9c83-46e2-a815-e85c98ac9d39/volumes" Sep 30 18:08:13 crc kubenswrapper[4797]: I0930 18:08:13.489336 4797 scope.go:117] "RemoveContainer" containerID="ea62b865d31ec338be9c647e612fd8d84155210dcdd889557093246b4daca098" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.191684 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.191733 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.191775 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.192662 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.192718 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" gracePeriod=600 Sep 30 18:08:14 crc kubenswrapper[4797]: E0930 18:08:14.323959 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.652065 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" exitCode=0 Sep 30 
18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.652145 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"} Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.652218 4797 scope.go:117] "RemoveContainer" containerID="2917b8990bf356e8e1ce6fe1b3ce1f29f0b790b2ad003c6a1e85f4a96a1de3ae" Sep 30 18:08:14 crc kubenswrapper[4797]: I0930 18:08:14.653317 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:08:14 crc kubenswrapper[4797]: E0930 18:08:14.653825 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.874996 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876216 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876234 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876252 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="extract-utilities" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 
18:08:24.876259 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="extract-utilities" Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876275 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="extract-content" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876284 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="extract-content" Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876308 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876316 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876338 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="extract-utilities" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876346 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="extract-utilities" Sep 30 18:08:24 crc kubenswrapper[4797]: E0930 18:08:24.876361 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="extract-content" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876369 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="extract-content" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.876665 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9485aab4-9c83-46e2-a815-e85c98ac9d39" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 
18:08:24.876686 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbf9e96-fc42-444d-b66b-db58c50ef2db" containerName="registry-server" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.883197 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:24 crc kubenswrapper[4797]: I0930 18:08:24.898405 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.028182 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.028226 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vscq4\" (UniqueName: \"kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.028251 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.129836 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.129891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vscq4\" (UniqueName: \"kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.129916 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.130625 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.130739 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.155355 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vscq4\" (UniqueName: 
\"kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4\") pod \"certified-operators-gd4rb\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.213714 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.663112 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:25 crc kubenswrapper[4797]: I0930 18:08:25.800670 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerStarted","Data":"e0c22aac466d7ec157ba1380d6109f8f51a90c1e89d6c6dcd5192d38a92319e4"} Sep 30 18:08:26 crc kubenswrapper[4797]: I0930 18:08:26.815762 4797 generic.go:334] "Generic (PLEG): container finished" podID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerID="0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913" exitCode=0 Sep 30 18:08:26 crc kubenswrapper[4797]: I0930 18:08:26.815814 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerDied","Data":"0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913"} Sep 30 18:08:27 crc kubenswrapper[4797]: I0930 18:08:27.828908 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerStarted","Data":"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78"} Sep 30 18:08:28 crc kubenswrapper[4797]: I0930 18:08:28.857202 4797 generic.go:334] "Generic (PLEG): container finished" podID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" 
containerID="d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78" exitCode=0 Sep 30 18:08:28 crc kubenswrapper[4797]: I0930 18:08:28.857290 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerDied","Data":"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78"} Sep 30 18:08:29 crc kubenswrapper[4797]: I0930 18:08:29.871398 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerStarted","Data":"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c"} Sep 30 18:08:29 crc kubenswrapper[4797]: I0930 18:08:29.897020 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gd4rb" podStartSLOduration=3.304159093 podStartE2EDuration="5.896999541s" podCreationTimestamp="2025-09-30 18:08:24 +0000 UTC" firstStartedPulling="2025-09-30 18:08:26.818901076 +0000 UTC m=+1557.341400334" lastFinishedPulling="2025-09-30 18:08:29.411741544 +0000 UTC m=+1559.934240782" observedRunningTime="2025-09-30 18:08:29.889380402 +0000 UTC m=+1560.411879650" watchObservedRunningTime="2025-09-30 18:08:29.896999541 +0000 UTC m=+1560.419498779" Sep 30 18:08:30 crc kubenswrapper[4797]: I0930 18:08:30.254600 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:08:30 crc kubenswrapper[4797]: E0930 18:08:30.255707 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:08:35 crc kubenswrapper[4797]: I0930 18:08:35.214643 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:35 crc kubenswrapper[4797]: I0930 18:08:35.215075 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:35 crc kubenswrapper[4797]: I0930 18:08:35.304019 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:36 crc kubenswrapper[4797]: I0930 18:08:36.022299 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:36 crc kubenswrapper[4797]: I0930 18:08:36.093832 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:37 crc kubenswrapper[4797]: I0930 18:08:37.967006 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gd4rb" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="registry-server" containerID="cri-o://8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c" gracePeriod=2 Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.560012 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.634416 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vscq4\" (UniqueName: \"kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4\") pod \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.634628 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities\") pod \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.634671 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content\") pod \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\" (UID: \"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01\") " Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.636541 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities" (OuterVolumeSpecName: "utilities") pod "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" (UID: "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.643575 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4" (OuterVolumeSpecName: "kube-api-access-vscq4") pod "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" (UID: "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01"). InnerVolumeSpecName "kube-api-access-vscq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.736904 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vscq4\" (UniqueName: \"kubernetes.io/projected/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-kube-api-access-vscq4\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.736944 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.958754 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" (UID: "c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.981421 4797 generic.go:334] "Generic (PLEG): container finished" podID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerID="8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c" exitCode=0 Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.981561 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerDied","Data":"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c"} Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.981620 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gd4rb" Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.982808 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd4rb" event={"ID":"c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01","Type":"ContainerDied","Data":"e0c22aac466d7ec157ba1380d6109f8f51a90c1e89d6c6dcd5192d38a92319e4"} Sep 30 18:08:38 crc kubenswrapper[4797]: I0930 18:08:38.983075 4797 scope.go:117] "RemoveContainer" containerID="8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.019932 4797 scope.go:117] "RemoveContainer" containerID="d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.044036 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.048602 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.049336 4797 scope.go:117] "RemoveContainer" containerID="0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.061821 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gd4rb"] Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.092803 4797 scope.go:117] "RemoveContainer" containerID="8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c" Sep 30 18:08:39 crc kubenswrapper[4797]: E0930 18:08:39.093175 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c\": 
container with ID starting with 8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c not found: ID does not exist" containerID="8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.093207 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c"} err="failed to get container status \"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c\": rpc error: code = NotFound desc = could not find container \"8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c\": container with ID starting with 8082fbcb609c22b6ea8d66b46aaad54d6be3e7f6088a2e703fc89fd3b7b9988c not found: ID does not exist" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.093236 4797 scope.go:117] "RemoveContainer" containerID="d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78" Sep 30 18:08:39 crc kubenswrapper[4797]: E0930 18:08:39.093549 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78\": container with ID starting with d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78 not found: ID does not exist" containerID="d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.093595 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78"} err="failed to get container status \"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78\": rpc error: code = NotFound desc = could not find container \"d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78\": container with ID starting with 
d47ee17333b5ac37db4e766ff3da54884175259a6c794502ceb1356f7711ab78 not found: ID does not exist" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.093653 4797 scope.go:117] "RemoveContainer" containerID="0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913" Sep 30 18:08:39 crc kubenswrapper[4797]: E0930 18:08:39.093927 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913\": container with ID starting with 0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913 not found: ID does not exist" containerID="0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913" Sep 30 18:08:39 crc kubenswrapper[4797]: I0930 18:08:39.093952 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913"} err="failed to get container status \"0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913\": rpc error: code = NotFound desc = could not find container \"0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913\": container with ID starting with 0edc1c896b2cef9fde0053c117a24fa3448c87fe180ee374050f0a671277e913 not found: ID does not exist" Sep 30 18:08:40 crc kubenswrapper[4797]: I0930 18:08:40.261510 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" path="/var/lib/kubelet/pods/c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01/volumes" Sep 30 18:08:44 crc kubenswrapper[4797]: I0930 18:08:44.238815 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:08:44 crc kubenswrapper[4797]: E0930 18:08:44.239629 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:08:58 crc kubenswrapper[4797]: I0930 18:08:58.238712 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:08:58 crc kubenswrapper[4797]: E0930 18:08:58.240364 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:09:12 crc kubenswrapper[4797]: I0930 18:09:12.237965 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:09:12 crc kubenswrapper[4797]: E0930 18:09:12.238881 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:09:13 crc kubenswrapper[4797]: I0930 18:09:13.615251 4797 scope.go:117] "RemoveContainer" containerID="16815b620a0c0bde09222225a3d16bf86d63b745e21f7334c4cf084e5dd9d911" Sep 30 18:09:26 crc kubenswrapper[4797]: I0930 18:09:26.238403 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:09:26 crc 
kubenswrapper[4797]: E0930 18:09:26.239568 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:09:37 crc kubenswrapper[4797]: I0930 18:09:37.238470 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:09:37 crc kubenswrapper[4797]: E0930 18:09:37.240124 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:09:51 crc kubenswrapper[4797]: I0930 18:09:51.238033 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:09:51 crc kubenswrapper[4797]: E0930 18:09:51.238904 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:10:02 crc kubenswrapper[4797]: I0930 18:10:02.245016 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 
30 18:10:02 crc kubenswrapper[4797]: E0930 18:10:02.246248 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.735560 4797 scope.go:117] "RemoveContainer" containerID="6097f2da00d39afd813bfce60bba79e4998f0c7907c8e2fc299c038c31c18a5f" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.773610 4797 scope.go:117] "RemoveContainer" containerID="8ba8fbaf8a87414db99e619ed9e300ec9d78a3bf57971ae3ca9c2a1eeb3a13b9" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.823200 4797 scope.go:117] "RemoveContainer" containerID="592ba570993cef9acfa989375a15e368922a1c1d2f8bd4524b80f15a393657f9" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.884815 4797 scope.go:117] "RemoveContainer" containerID="5b9dd7fae86ef0be2206f53e3e8716549909735eafa07aa7396578a8b42d8812" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.916423 4797 scope.go:117] "RemoveContainer" containerID="1466891f6e258498ed63ff7c280a5a201496a1c5b34361a655ecbbedff8b7ae3" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.940630 4797 scope.go:117] "RemoveContainer" containerID="0b4a481a85d3c7ed8512a3acc645bf3694785efe9387e8b6adedb90a962f4cc1" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.962685 4797 scope.go:117] "RemoveContainer" containerID="710ec4d876a01c70f41d96c08edd8e40a6ca4aa51ca85b1b6c0bdac813c31535" Sep 30 18:10:13 crc kubenswrapper[4797]: I0930 18:10:13.993726 4797 scope.go:117] "RemoveContainer" containerID="ee29e6878947000c47f020f8ece4e901886e8a298913a397615a841d09e8da59" Sep 30 18:10:14 crc kubenswrapper[4797]: I0930 18:10:14.017662 4797 
scope.go:117] "RemoveContainer" containerID="d8ba5334b065be7485c370f94caa6e088fe691fe068ffdd5a3ca62405baf8b7b" Sep 30 18:10:14 crc kubenswrapper[4797]: I0930 18:10:14.039072 4797 scope.go:117] "RemoveContainer" containerID="84fda0e5dc81e1865434ee46de854012c61b7e9abbcc06163aed5055efbca85a" Sep 30 18:10:17 crc kubenswrapper[4797]: I0930 18:10:17.238643 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:10:17 crc kubenswrapper[4797]: E0930 18:10:17.240280 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:10:18 crc kubenswrapper[4797]: I0930 18:10:18.053269 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-dfjp2"] Sep 30 18:10:18 crc kubenswrapper[4797]: I0930 18:10:18.063407 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-dfjp2"] Sep 30 18:10:18 crc kubenswrapper[4797]: I0930 18:10:18.252856 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe9cbac-3168-4887-81ea-51d04c2a70c8" path="/var/lib/kubelet/pods/efe9cbac-3168-4887-81ea-51d04c2a70c8/volumes" Sep 30 18:10:27 crc kubenswrapper[4797]: I0930 18:10:27.043877 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4r4kf"] Sep 30 18:10:27 crc kubenswrapper[4797]: I0930 18:10:27.052712 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4r4kf"] Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.041089 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-6t2ts"] Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.050120 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2ktxf"] Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.057478 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6t2ts"] Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.103006 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2ktxf"] Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.254594 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c787cf9-c86e-47f2-a4a9-30c4edc09890" path="/var/lib/kubelet/pods/0c787cf9-c86e-47f2-a4a9-30c4edc09890/volumes" Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.258200 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3b6e16-37d0-4b8e-96da-1a4c21c30af6" path="/var/lib/kubelet/pods/4b3b6e16-37d0-4b8e-96da-1a4c21c30af6/volumes" Sep 30 18:10:28 crc kubenswrapper[4797]: I0930 18:10:28.259472 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61c85c6-fd1a-4e9d-884d-7793317d857a" path="/var/lib/kubelet/pods/e61c85c6-fd1a-4e9d-884d-7793317d857a/volumes" Sep 30 18:10:31 crc kubenswrapper[4797]: I0930 18:10:31.238256 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:10:31 crc kubenswrapper[4797]: E0930 18:10:31.239216 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:10:31 crc kubenswrapper[4797]: I0930 
18:10:31.321177 4797 generic.go:334] "Generic (PLEG): container finished" podID="76ed1105-fad0-4d4d-9039-06795b66a457" containerID="4bf3f55b36f4100c88b2bcfda147b4508ad152476d95ea2c4204d38ac37945d5" exitCode=0 Sep 30 18:10:31 crc kubenswrapper[4797]: I0930 18:10:31.321227 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" event={"ID":"76ed1105-fad0-4d4d-9039-06795b66a457","Type":"ContainerDied","Data":"4bf3f55b36f4100c88b2bcfda147b4508ad152476d95ea2c4204d38ac37945d5"} Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.027315 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-fc85-account-create-48fpf"] Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.040183 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-fc85-account-create-48fpf"] Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.250494 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adb5919-faea-4783-819a-867e38b3d90f" path="/var/lib/kubelet/pods/6adb5919-faea-4783-819a-867e38b3d90f/volumes" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.771184 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.889366 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle\") pod \"76ed1105-fad0-4d4d-9039-06795b66a457\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.889590 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory\") pod \"76ed1105-fad0-4d4d-9039-06795b66a457\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.889640 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl72h\" (UniqueName: \"kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h\") pod \"76ed1105-fad0-4d4d-9039-06795b66a457\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.889667 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key\") pod \"76ed1105-fad0-4d4d-9039-06795b66a457\" (UID: \"76ed1105-fad0-4d4d-9039-06795b66a457\") " Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.895752 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h" (OuterVolumeSpecName: "kube-api-access-xl72h") pod "76ed1105-fad0-4d4d-9039-06795b66a457" (UID: "76ed1105-fad0-4d4d-9039-06795b66a457"). InnerVolumeSpecName "kube-api-access-xl72h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.896635 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "76ed1105-fad0-4d4d-9039-06795b66a457" (UID: "76ed1105-fad0-4d4d-9039-06795b66a457"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.929663 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory" (OuterVolumeSpecName: "inventory") pod "76ed1105-fad0-4d4d-9039-06795b66a457" (UID: "76ed1105-fad0-4d4d-9039-06795b66a457"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.931625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76ed1105-fad0-4d4d-9039-06795b66a457" (UID: "76ed1105-fad0-4d4d-9039-06795b66a457"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.992240 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.992574 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl72h\" (UniqueName: \"kubernetes.io/projected/76ed1105-fad0-4d4d-9039-06795b66a457-kube-api-access-xl72h\") on node \"crc\" DevicePath \"\"" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.992667 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:10:32 crc kubenswrapper[4797]: I0930 18:10:32.992744 4797 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ed1105-fad0-4d4d-9039-06795b66a457-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.343181 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" event={"ID":"76ed1105-fad0-4d4d-9039-06795b66a457","Type":"ContainerDied","Data":"8366b6b19892c178d71bef69f08436ea4a14f6d83aa1441b94be42ec99edf288"} Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.343228 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8366b6b19892c178d71bef69f08436ea4a14f6d83aa1441b94be42ec99edf288" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.343275 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.448979 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"] Sep 30 18:10:33 crc kubenswrapper[4797]: E0930 18:10:33.449466 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ed1105-fad0-4d4d-9039-06795b66a457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449489 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ed1105-fad0-4d4d-9039-06795b66a457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 18:10:33 crc kubenswrapper[4797]: E0930 18:10:33.449517 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="extract-utilities" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449526 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="extract-utilities" Sep 30 18:10:33 crc kubenswrapper[4797]: E0930 18:10:33.449560 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="extract-content" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449566 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="extract-content" Sep 30 18:10:33 crc kubenswrapper[4797]: E0930 18:10:33.449574 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="registry-server" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449580 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="registry-server" Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449752 
4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b95b5a-fcb6-4e0b-9df1-5e99a2980b01" containerName="registry-server"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.449783 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ed1105-fad0-4d4d-9039-06795b66a457" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.450486 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.452918 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.453248 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.453491 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.453715 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.460888 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"]
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.502568 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.502637 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.503032 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb55b\" (UniqueName: \"kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.605188 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.605291 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.605449 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb55b\" (UniqueName: \"kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.611244 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.612587 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.624566 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb55b\" (UniqueName: \"kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:33 crc kubenswrapper[4797]: I0930 18:10:33.785062 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:10:34 crc kubenswrapper[4797]: I0930 18:10:34.312208 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"]
Sep 30 18:10:34 crc kubenswrapper[4797]: I0930 18:10:34.320039 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 18:10:34 crc kubenswrapper[4797]: I0930 18:10:34.358784 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h" event={"ID":"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7","Type":"ContainerStarted","Data":"65daf900f7d1da46a37b406643d0b1851fbf2298004ecf1772640f2198ac860d"}
Sep 30 18:10:35 crc kubenswrapper[4797]: I0930 18:10:35.389350 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h" event={"ID":"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7","Type":"ContainerStarted","Data":"265068566a2828923aa6c01d67fef3f52830d25e345111f1d494a9456334df74"}
Sep 30 18:10:35 crc kubenswrapper[4797]: I0930 18:10:35.425362 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h" podStartSLOduration=1.938326219 podStartE2EDuration="2.425338554s" podCreationTimestamp="2025-09-30 18:10:33 +0000 UTC" firstStartedPulling="2025-09-30 18:10:34.319759074 +0000 UTC m=+1684.842258312" lastFinishedPulling="2025-09-30 18:10:34.806771409 +0000 UTC m=+1685.329270647" observedRunningTime="2025-09-30 18:10:35.412975255 +0000 UTC m=+1685.935474533" watchObservedRunningTime="2025-09-30 18:10:35.425338554 +0000 UTC m=+1685.947837812"
Sep 30 18:10:36 crc kubenswrapper[4797]: I0930 18:10:36.039419 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7981-account-create-2vvns"]
Sep 30 18:10:36 crc kubenswrapper[4797]: I0930 18:10:36.048419 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7981-account-create-2vvns"]
Sep 30 18:10:36 crc kubenswrapper[4797]: I0930 18:10:36.255807 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedc91e0-c1c4-41b5-90f7-415b32cd9ca2" path="/var/lib/kubelet/pods/eedc91e0-c1c4-41b5-90f7-415b32cd9ca2/volumes"
Sep 30 18:10:42 crc kubenswrapper[4797]: I0930 18:10:42.239051 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:10:42 crc kubenswrapper[4797]: E0930 18:10:42.240543 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:10:43 crc kubenswrapper[4797]: I0930 18:10:43.046405 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-96eb-account-create-5nvh4"]
Sep 30 18:10:43 crc kubenswrapper[4797]: I0930 18:10:43.058270 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-96eb-account-create-5nvh4"]
Sep 30 18:10:44 crc kubenswrapper[4797]: I0930 18:10:44.255045 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d827f7-8cc5-41fc-a453-880a27418b98" path="/var/lib/kubelet/pods/54d827f7-8cc5-41fc-a453-880a27418b98/volumes"
Sep 30 18:10:46 crc kubenswrapper[4797]: I0930 18:10:46.048098 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e107-account-create-tcs6r"]
Sep 30 18:10:46 crc kubenswrapper[4797]: I0930 18:10:46.067929 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e107-account-create-tcs6r"]
Sep 30 18:10:46 crc kubenswrapper[4797]: I0930 18:10:46.251972 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f0af8c-9d47-4ab5-807d-7853398fcf78" path="/var/lib/kubelet/pods/b1f0af8c-9d47-4ab5-807d-7853398fcf78/volumes"
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.071770 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kmcqt"]
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.087037 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kfmjd"]
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.096069 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-kmcqt"]
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.103590 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kfmjd"]
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.261747 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df77de6-491d-4cbb-a8e4-ce74e0e99e9f" path="/var/lib/kubelet/pods/4df77de6-491d-4cbb-a8e4-ce74e0e99e9f/volumes"
Sep 30 18:10:54 crc kubenswrapper[4797]: I0930 18:10:54.263202 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd66af3e-b824-414c-abd8-339c62a9e9e1" path="/var/lib/kubelet/pods/dd66af3e-b824-414c-abd8-339c62a9e9e1/volumes"
Sep 30 18:10:57 crc kubenswrapper[4797]: I0930 18:10:57.039151 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fqpjz"]
Sep 30 18:10:57 crc kubenswrapper[4797]: I0930 18:10:57.053317 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fqpjz"]
Sep 30 18:10:57 crc kubenswrapper[4797]: I0930 18:10:57.241420 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:10:57 crc kubenswrapper[4797]: E0930 18:10:57.241987 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:10:58 crc kubenswrapper[4797]: I0930 18:10:58.252473 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a0c476-b584-4818-a7c0-a5da97aaf4df" path="/var/lib/kubelet/pods/f2a0c476-b584-4818-a7c0-a5da97aaf4df/volumes"
Sep 30 18:11:08 crc kubenswrapper[4797]: I0930 18:11:08.241998 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:11:08 crc kubenswrapper[4797]: E0930 18:11:08.247893 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.164619 4797 scope.go:117] "RemoveContainer" containerID="15d0eb2786741206fc82ff61e17abb660bb8d2e41e57f0e72658349b2a957ad3"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.211796 4797 scope.go:117] "RemoveContainer" containerID="d336439f2abc7c706dbf6b4deba195f1732b63db8db4cf6d2ffa2ccd4eb61ff8"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.275761 4797 scope.go:117] "RemoveContainer" containerID="7a07c0a556d68c593cf31c83956e7f32fd3822c0fce425b020e522c3dd16ab26"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.298824 4797 scope.go:117] "RemoveContainer" containerID="0934b925c8771a692e7fa91a3c882d8aeeb188e15c2a26ad1b0b2f6da26823d6"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.349703 4797 scope.go:117] "RemoveContainer" containerID="18913c8dad6b4389db5c95b8c8d2061f872585bf65da39434d5a92d70cb9b35c"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.391503 4797 scope.go:117] "RemoveContainer" containerID="d95becb5ce012d2a876f6fe02de65b310ac3b6aedd147337d1387a2bfd04f8b5"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.440548 4797 scope.go:117] "RemoveContainer" containerID="08d22fc23d4e511563e7b9dd2ce31315f91fb29ec4579749adfbcc939316fd48"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.486880 4797 scope.go:117] "RemoveContainer" containerID="5555af5f93fa7e03416c149d5d3487373a712ed327c8aa9534586239983e04ab"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.516024 4797 scope.go:117] "RemoveContainer" containerID="17759ec73c8e06d52425478a32d5c9292e4a5f2df9f2b0e9cc066595b7f1bf37"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.539782 4797 scope.go:117] "RemoveContainer" containerID="83b26af432cda3863ffd7d53c2a3b673abd70ad500ede9638dc17372ac9581f5"
Sep 30 18:11:14 crc kubenswrapper[4797]: I0930 18:11:14.562365 4797 scope.go:117] "RemoveContainer" containerID="7402d957f9883b0b8c6151beed01acebd3f907b6e33aad141ac646d4d1244c33"
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.041495 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-de80-account-create-v87xg"]
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.053413 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-932a-account-create-pq6wq"]
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.063753 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-932a-account-create-pq6wq"]
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.074012 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-de80-account-create-v87xg"]
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.082274 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bbfa-account-create-gjc2s"]
Sep 30 18:11:15 crc kubenswrapper[4797]: I0930 18:11:15.093341 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bbfa-account-create-gjc2s"]
Sep 30 18:11:16 crc kubenswrapper[4797]: I0930 18:11:16.255894 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499f35f7-af0b-4713-858e-ec4123da64a0" path="/var/lib/kubelet/pods/499f35f7-af0b-4713-858e-ec4123da64a0/volumes"
Sep 30 18:11:16 crc kubenswrapper[4797]: I0930 18:11:16.257827 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0627c8-4235-4c74-81eb-aef495551b9f" path="/var/lib/kubelet/pods/8f0627c8-4235-4c74-81eb-aef495551b9f/volumes"
Sep 30 18:11:16 crc kubenswrapper[4797]: I0930 18:11:16.259159 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc55cc29-cee8-4470-a6f6-398ff183ba0b" path="/var/lib/kubelet/pods/cc55cc29-cee8-4470-a6f6-398ff183ba0b/volumes"
Sep 30 18:11:17 crc kubenswrapper[4797]: I0930 18:11:17.052583 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-g9tqz"]
Sep 30 18:11:17 crc kubenswrapper[4797]: I0930 18:11:17.082551 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-g9tqz"]
Sep 30 18:11:18 crc kubenswrapper[4797]: I0930 18:11:18.253990 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f8ded7-5fab-43fb-8d0f-f514889b5640" path="/var/lib/kubelet/pods/d7f8ded7-5fab-43fb-8d0f-f514889b5640/volumes"
Sep 30 18:11:21 crc kubenswrapper[4797]: I0930 18:11:21.237994 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:11:21 crc kubenswrapper[4797]: E0930 18:11:21.238563 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:11:25 crc kubenswrapper[4797]: I0930 18:11:25.065286 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jqh9l"]
Sep 30 18:11:25 crc kubenswrapper[4797]: I0930 18:11:25.075705 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jqh9l"]
Sep 30 18:11:26 crc kubenswrapper[4797]: I0930 18:11:26.257236 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34d2759-ae6e-43a3-8010-80596a570a37" path="/var/lib/kubelet/pods/f34d2759-ae6e-43a3-8010-80596a570a37/volumes"
Sep 30 18:11:36 crc kubenswrapper[4797]: I0930 18:11:36.241177 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:11:36 crc kubenswrapper[4797]: E0930 18:11:36.242706 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:11:48 crc kubenswrapper[4797]: I0930 18:11:48.239850 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:11:48 crc kubenswrapper[4797]: E0930 18:11:48.240780 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:12:02 crc kubenswrapper[4797]: I0930 18:12:02.238817 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:12:02 crc kubenswrapper[4797]: E0930 18:12:02.240194 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:12:03 crc kubenswrapper[4797]: I0930 18:12:03.085884 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-f899g"]
Sep 30 18:12:03 crc kubenswrapper[4797]: I0930 18:12:03.096611 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-f899g"]
Sep 30 18:12:04 crc kubenswrapper[4797]: I0930 18:12:04.253004 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe16203f-60b5-483f-83b5-1d26b25292c9" path="/var/lib/kubelet/pods/fe16203f-60b5-483f-83b5-1d26b25292c9/volumes"
Sep 30 18:12:05 crc kubenswrapper[4797]: I0930 18:12:05.040059 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vlvs2"]
Sep 30 18:12:05 crc kubenswrapper[4797]: I0930 18:12:05.050347 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vlvs2"]
Sep 30 18:12:06 crc kubenswrapper[4797]: I0930 18:12:06.253593 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2deac399-0305-404a-bf66-cc9d4e122a3a" path="/var/lib/kubelet/pods/2deac399-0305-404a-bf66-cc9d4e122a3a/volumes"
Sep 30 18:12:13 crc kubenswrapper[4797]: I0930 18:12:13.043314 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2q6s8"]
Sep 30 18:12:13 crc kubenswrapper[4797]: I0930 18:12:13.053935 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2q6s8"]
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.038686 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7w9jn"]
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.048122 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7w9jn"]
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.285922 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6162f46a-e25e-4bcf-8e84-77d28c565c1c" path="/var/lib/kubelet/pods/6162f46a-e25e-4bcf-8e84-77d28c565c1c/volumes"
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.290967 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b2a372-883f-4418-9939-6de336219cb8" path="/var/lib/kubelet/pods/e7b2a372-883f-4418-9939-6de336219cb8/volumes"
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.875405 4797 scope.go:117] "RemoveContainer" containerID="ee4184338f4202c69830b3b41d086b57863af8f284f4e6c7d49eac529777f16a"
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.932362 4797 scope.go:117] "RemoveContainer" containerID="0a3d51a5d0e7481e5aa6b162ab3b98975b6c21a692ec800cc4369e1149c4b43f"
Sep 30 18:12:14 crc kubenswrapper[4797]: I0930 18:12:14.998045 4797 scope.go:117] "RemoveContainer" containerID="5cbf1fa59249f612abc40fd5a72b59c307cb46eaed7b67111ff0c7429004bf0c"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.032926 4797 scope.go:117] "RemoveContainer" containerID="902a7a18c877f848c42dc317847cb1ac3a746878bf1d1544040483c12ba8a9c1"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.084338 4797 scope.go:117] "RemoveContainer" containerID="ff921115c86c25979c1c63c0a2a2e99fbc994c692c94ee22efb61b0e7107cbcf"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.129165 4797 scope.go:117] "RemoveContainer" containerID="67a7976a1409b78457acdbd2edd3f78545964324b8aee1c4d4ea1dd1782c8b34"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.181614 4797 scope.go:117] "RemoveContainer" containerID="3af9863f921ac19ad985e6355e638a2adad2f8f630db7b65158976422722ce09"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.226043 4797 scope.go:117] "RemoveContainer" containerID="ca1dab9996455a1209064e4c2671e4c855cd56451110a56a64ad5befbc571c59"
Sep 30 18:12:15 crc kubenswrapper[4797]: I0930 18:12:15.245322 4797 scope.go:117] "RemoveContainer" containerID="00b9cb38b97622f63cdb7a5d519afd3f6fe820c91a704879d41a4abb194269e0"
Sep 30 18:12:16 crc kubenswrapper[4797]: I0930 18:12:16.239158 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:12:16 crc kubenswrapper[4797]: E0930 18:12:16.241129 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:12:29 crc kubenswrapper[4797]: I0930 18:12:29.238300 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b"
Sep 30 18:12:29 crc kubenswrapper[4797]: E0930 18:12:29.239100 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:12:29 crc kubenswrapper[4797]: I0930 18:12:29.825112 4797 generic.go:334] "Generic (PLEG): container finished" podID="deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" containerID="265068566a2828923aa6c01d67fef3f52830d25e345111f1d494a9456334df74" exitCode=0
Sep 30 18:12:29 crc kubenswrapper[4797]: I0930 18:12:29.825265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h" event={"ID":"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7","Type":"ContainerDied","Data":"265068566a2828923aa6c01d67fef3f52830d25e345111f1d494a9456334df74"}
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.404591 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.522186 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory\") pod \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") "
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.522261 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb55b\" (UniqueName: \"kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b\") pod \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") "
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.522382 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key\") pod \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\" (UID: \"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7\") "
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.528020 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b" (OuterVolumeSpecName: "kube-api-access-fb55b") pod "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" (UID: "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7"). InnerVolumeSpecName "kube-api-access-fb55b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.551319 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory" (OuterVolumeSpecName: "inventory") pod "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" (UID: "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.557579 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" (UID: "deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.624650 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.624687 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.624699 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb55b\" (UniqueName: \"kubernetes.io/projected/deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7-kube-api-access-fb55b\") on node \"crc\" DevicePath \"\""
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.848495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h" event={"ID":"deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7","Type":"ContainerDied","Data":"65daf900f7d1da46a37b406643d0b1851fbf2298004ecf1772640f2198ac860d"}
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.848871 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65daf900f7d1da46a37b406643d0b1851fbf2298004ecf1772640f2198ac860d"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.849001 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.945341 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"]
Sep 30 18:12:31 crc kubenswrapper[4797]: E0930 18:12:31.945851 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.945872 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.946048 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.946766 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.951268 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.951296 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.951862 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.951965 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 18:12:31 crc kubenswrapper[4797]: I0930 18:12:31.971903 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"]
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.033196 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wjb\" (UniqueName: \"kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.033392 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.033471 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.135067 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wjb\" (UniqueName: \"kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.135179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.135219 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.145225 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.145252 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.162559 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wjb\" (UniqueName: \"kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-k9785\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.276241 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.732418 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785"]
Sep 30 18:12:32 crc kubenswrapper[4797]: I0930 18:12:32.860829 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" event={"ID":"49ebc230-80f5-4bd0-a2fb-91cd9705a000","Type":"ContainerStarted","Data":"0a4b04780bf5d6c4edda428e7f216d24471f78b6ace24e89818b4b2ef8e72783"}
Sep 30 18:12:33 crc kubenswrapper[4797]: I0930 18:12:33.875405 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" event={"ID":"49ebc230-80f5-4bd0-a2fb-91cd9705a000","Type":"ContainerStarted","Data":"69d4fd737d37874008576de81f4e5ceec79e4d8f7fd4feb62d6b71fe0032e1ca"}
Sep 30 18:12:33 crc kubenswrapper[4797]: I0930 18:12:33.893838 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" podStartSLOduration=2.372309768 podStartE2EDuration="2.89381541s" podCreationTimestamp="2025-09-30 18:12:31 +0000 UTC" firstStartedPulling="2025-09-30 18:12:32.736279314 +0000 UTC m=+1803.258778582" lastFinishedPulling="2025-09-30 18:12:33.257784946 +0000 UTC m=+1803.780284224" observedRunningTime="2025-09-30 18:12:33.891159847 +0000 UTC m=+1804.413659095" watchObservedRunningTime="2025-09-30 18:12:33.89381541 +0000 UTC m=+1804.416314658"
Sep 30 18:12:42 crc kubenswrapper[4797]: I0930 18:12:42.064098 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hvw66"]
Sep 30 18:12:42 crc kubenswrapper[4797]: I0930 18:12:42.071330 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hvw66"]
Sep 30 18:12:42 crc kubenswrapper[4797]: 
I0930 18:12:42.258424 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ba253b-ce64-4926-a8d3-1c8dd9dfef16" path="/var/lib/kubelet/pods/81ba253b-ce64-4926-a8d3-1c8dd9dfef16/volumes" Sep 30 18:12:43 crc kubenswrapper[4797]: I0930 18:12:43.238499 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:12:43 crc kubenswrapper[4797]: E0930 18:12:43.239687 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:12:44 crc kubenswrapper[4797]: I0930 18:12:44.031491 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2zrb5"] Sep 30 18:12:44 crc kubenswrapper[4797]: I0930 18:12:44.042049 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2zrb5"] Sep 30 18:12:44 crc kubenswrapper[4797]: I0930 18:12:44.256748 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844" path="/var/lib/kubelet/pods/7cdfac79-5fa6-4d9b-99b7-9f3cf3f67844/volumes" Sep 30 18:12:56 crc kubenswrapper[4797]: I0930 18:12:56.238060 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:12:56 crc kubenswrapper[4797]: E0930 18:12:56.239065 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:13:10 crc kubenswrapper[4797]: I0930 18:13:10.245025 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:13:10 crc kubenswrapper[4797]: E0930 18:13:10.246004 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.049550 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mnf4c"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.059175 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5hxvf"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.067476 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-sqrmz"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.080507 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mnf4c"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.089108 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-sqrmz"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.097356 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5hxvf"] Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.475884 4797 scope.go:117] "RemoveContainer" 
containerID="03ccb9d672dd73c73fff5a145ef9ae0c478edaa773275a5b56ae9caa10998c44" Sep 30 18:13:15 crc kubenswrapper[4797]: I0930 18:13:15.534280 4797 scope.go:117] "RemoveContainer" containerID="2bbb0045459939763b9fc5c7e459682d43046ab67bce2a3ba42bab69bac711d2" Sep 30 18:13:16 crc kubenswrapper[4797]: I0930 18:13:16.249742 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92a15c8-7839-4cad-ab41-953e0d5c44f1" path="/var/lib/kubelet/pods/b92a15c8-7839-4cad-ab41-953e0d5c44f1/volumes" Sep 30 18:13:16 crc kubenswrapper[4797]: I0930 18:13:16.250315 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b9e58f-0903-462b-8e48-b8fbaca162d9" path="/var/lib/kubelet/pods/e0b9e58f-0903-462b-8e48-b8fbaca162d9/volumes" Sep 30 18:13:16 crc kubenswrapper[4797]: I0930 18:13:16.250823 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e516f41b-fb09-421f-9cb2-a10e1a24f02c" path="/var/lib/kubelet/pods/e516f41b-fb09-421f-9cb2-a10e1a24f02c/volumes" Sep 30 18:13:22 crc kubenswrapper[4797]: I0930 18:13:22.239063 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:13:23 crc kubenswrapper[4797]: I0930 18:13:23.468883 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113"} Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.057527 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6d23-account-create-fkt29"] Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.067681 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5820-account-create-2xmfp"] Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.079169 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-7dbd-account-create-8qlmz"] Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.092711 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6d23-account-create-fkt29"] Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.101161 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5820-account-create-2xmfp"] Sep 30 18:13:25 crc kubenswrapper[4797]: I0930 18:13:25.108887 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7dbd-account-create-8qlmz"] Sep 30 18:13:26 crc kubenswrapper[4797]: I0930 18:13:26.249876 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6233c517-db8f-4d92-9d2a-486931d3cd14" path="/var/lib/kubelet/pods/6233c517-db8f-4d92-9d2a-486931d3cd14/volumes" Sep 30 18:13:26 crc kubenswrapper[4797]: I0930 18:13:26.250737 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6188720-6c09-49d3-b419-9b8de57dd718" path="/var/lib/kubelet/pods/a6188720-6c09-49d3-b419-9b8de57dd718/volumes" Sep 30 18:13:26 crc kubenswrapper[4797]: I0930 18:13:26.251213 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec964b14-f7af-4e27-9b92-6201023d7cc1" path="/var/lib/kubelet/pods/ec964b14-f7af-4e27-9b92-6201023d7cc1/volumes" Sep 30 18:13:50 crc kubenswrapper[4797]: I0930 18:13:50.759971 4797 generic.go:334] "Generic (PLEG): container finished" podID="49ebc230-80f5-4bd0-a2fb-91cd9705a000" containerID="69d4fd737d37874008576de81f4e5ceec79e4d8f7fd4feb62d6b71fe0032e1ca" exitCode=0 Sep 30 18:13:50 crc kubenswrapper[4797]: I0930 18:13:50.760076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" event={"ID":"49ebc230-80f5-4bd0-a2fb-91cd9705a000","Type":"ContainerDied","Data":"69d4fd737d37874008576de81f4e5ceec79e4d8f7fd4feb62d6b71fe0032e1ca"} Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.257798 4797 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.391895 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory\") pod \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.391997 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key\") pod \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.392063 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86wjb\" (UniqueName: \"kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb\") pod \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\" (UID: \"49ebc230-80f5-4bd0-a2fb-91cd9705a000\") " Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.406774 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb" (OuterVolumeSpecName: "kube-api-access-86wjb") pod "49ebc230-80f5-4bd0-a2fb-91cd9705a000" (UID: "49ebc230-80f5-4bd0-a2fb-91cd9705a000"). InnerVolumeSpecName "kube-api-access-86wjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.446212 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49ebc230-80f5-4bd0-a2fb-91cd9705a000" (UID: "49ebc230-80f5-4bd0-a2fb-91cd9705a000"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.446522 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory" (OuterVolumeSpecName: "inventory") pod "49ebc230-80f5-4bd0-a2fb-91cd9705a000" (UID: "49ebc230-80f5-4bd0-a2fb-91cd9705a000"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.495142 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.495321 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ebc230-80f5-4bd0-a2fb-91cd9705a000-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.495473 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86wjb\" (UniqueName: \"kubernetes.io/projected/49ebc230-80f5-4bd0-a2fb-91cd9705a000-kube-api-access-86wjb\") on node \"crc\" DevicePath \"\"" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.785704 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" event={"ID":"49ebc230-80f5-4bd0-a2fb-91cd9705a000","Type":"ContainerDied","Data":"0a4b04780bf5d6c4edda428e7f216d24471f78b6ace24e89818b4b2ef8e72783"} Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.786188 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4b04780bf5d6c4edda428e7f216d24471f78b6ace24e89818b4b2ef8e72783" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.785774 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-k9785" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.927841 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn"] Sep 30 18:13:52 crc kubenswrapper[4797]: E0930 18:13:52.928734 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ebc230-80f5-4bd0-a2fb-91cd9705a000" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.928761 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ebc230-80f5-4bd0-a2fb-91cd9705a000" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.929324 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ebc230-80f5-4bd0-a2fb-91cd9705a000" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.930544 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.934194 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.934893 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.934900 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.942814 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:13:52 crc kubenswrapper[4797]: I0930 18:13:52.963521 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn"] Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.108983 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.109091 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.109122 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fklm\" (UniqueName: \"kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.210472 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.210561 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.210601 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fklm\" (UniqueName: \"kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.214680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.221105 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.226123 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fklm\" (UniqueName: \"kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-fkswn\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.268187 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:13:53 crc kubenswrapper[4797]: W0930 18:13:53.837255 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14617993_cbb9_43c8_9ec5_d3a3afb1bc19.slice/crio-1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7 WatchSource:0}: Error finding container 1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7: Status 404 returned error can't find the container with id 1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7 Sep 30 18:13:53 crc kubenswrapper[4797]: I0930 18:13:53.837520 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn"] Sep 30 18:13:54 crc kubenswrapper[4797]: I0930 18:13:54.809348 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" event={"ID":"14617993-cbb9-43c8-9ec5-d3a3afb1bc19","Type":"ContainerStarted","Data":"1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7"} Sep 30 18:13:55 crc kubenswrapper[4797]: I0930 18:13:55.822855 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" event={"ID":"14617993-cbb9-43c8-9ec5-d3a3afb1bc19","Type":"ContainerStarted","Data":"8c044102fe3bae241b9abf675cc277aa57da377d3cde3a253cbf90709ccf5f9a"} Sep 30 18:13:55 crc kubenswrapper[4797]: I0930 18:13:55.841591 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" podStartSLOduration=3.124034666 podStartE2EDuration="3.841575303s" podCreationTimestamp="2025-09-30 18:13:52 +0000 UTC" firstStartedPulling="2025-09-30 18:13:53.842175696 +0000 UTC m=+1884.364674924" lastFinishedPulling="2025-09-30 18:13:54.559716293 +0000 UTC 
m=+1885.082215561" observedRunningTime="2025-09-30 18:13:55.838382156 +0000 UTC m=+1886.360881404" watchObservedRunningTime="2025-09-30 18:13:55.841575303 +0000 UTC m=+1886.364074541" Sep 30 18:13:56 crc kubenswrapper[4797]: I0930 18:13:56.047581 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ltgsf"] Sep 30 18:13:56 crc kubenswrapper[4797]: I0930 18:13:56.055751 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ltgsf"] Sep 30 18:13:56 crc kubenswrapper[4797]: I0930 18:13:56.252133 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90801fb-fe0a-4517-be34-b1ad52f0029e" path="/var/lib/kubelet/pods/d90801fb-fe0a-4517-be34-b1ad52f0029e/volumes" Sep 30 18:14:00 crc kubenswrapper[4797]: I0930 18:14:00.884966 4797 generic.go:334] "Generic (PLEG): container finished" podID="14617993-cbb9-43c8-9ec5-d3a3afb1bc19" containerID="8c044102fe3bae241b9abf675cc277aa57da377d3cde3a253cbf90709ccf5f9a" exitCode=0 Sep 30 18:14:00 crc kubenswrapper[4797]: I0930 18:14:00.885033 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" event={"ID":"14617993-cbb9-43c8-9ec5-d3a3afb1bc19","Type":"ContainerDied","Data":"8c044102fe3bae241b9abf675cc277aa57da377d3cde3a253cbf90709ccf5f9a"} Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.465670 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.613040 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory\") pod \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.613235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fklm\" (UniqueName: \"kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm\") pod \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.613290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key\") pod \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\" (UID: \"14617993-cbb9-43c8-9ec5-d3a3afb1bc19\") " Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.619724 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm" (OuterVolumeSpecName: "kube-api-access-5fklm") pod "14617993-cbb9-43c8-9ec5-d3a3afb1bc19" (UID: "14617993-cbb9-43c8-9ec5-d3a3afb1bc19"). InnerVolumeSpecName "kube-api-access-5fklm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.663826 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14617993-cbb9-43c8-9ec5-d3a3afb1bc19" (UID: "14617993-cbb9-43c8-9ec5-d3a3afb1bc19"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.677289 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory" (OuterVolumeSpecName: "inventory") pod "14617993-cbb9-43c8-9ec5-d3a3afb1bc19" (UID: "14617993-cbb9-43c8-9ec5-d3a3afb1bc19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.715663 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.715699 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fklm\" (UniqueName: \"kubernetes.io/projected/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-kube-api-access-5fklm\") on node \"crc\" DevicePath \"\"" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.715710 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14617993-cbb9-43c8-9ec5-d3a3afb1bc19-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.931854 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" event={"ID":"14617993-cbb9-43c8-9ec5-d3a3afb1bc19","Type":"ContainerDied","Data":"1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7"} Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.932144 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1543119bd4c06791eb28e969a679f13288a6805719416b9a33b103cf91ae9ad7" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.931916 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-fkswn" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.997417 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"] Sep 30 18:14:02 crc kubenswrapper[4797]: E0930 18:14:02.997996 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14617993-cbb9-43c8-9ec5-d3a3afb1bc19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.998024 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="14617993-cbb9-43c8-9ec5-d3a3afb1bc19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.998272 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="14617993-cbb9-43c8-9ec5-d3a3afb1bc19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 18:14:02 crc kubenswrapper[4797]: I0930 18:14:02.999277 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.001624 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.001852 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.006912 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.007949 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.008646 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"]
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.124412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.124574 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.124610 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9s4\" (UniqueName: \"kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.227597 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.227709 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.227734 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9s4\" (UniqueName: \"kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.233258 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.236139 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.259807 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9s4\" (UniqueName: \"kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-k5jg9\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.322187 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:03 crc kubenswrapper[4797]: I0930 18:14:03.955114 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"]
Sep 30 18:14:03 crc kubenswrapper[4797]: W0930 18:14:03.964918 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab411d2_db8f_47ff_9233_739acad6d3ee.slice/crio-7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8 WatchSource:0}: Error finding container 7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8: Status 404 returned error can't find the container with id 7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8
Sep 30 18:14:04 crc kubenswrapper[4797]: I0930 18:14:04.954025 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9" event={"ID":"2ab411d2-db8f-47ff-9233-739acad6d3ee","Type":"ContainerStarted","Data":"2e0ee58e50ad61877e0c7afd60dace6dc2c18cefb20b45072f490bc1efe33342"}
Sep 30 18:14:04 crc kubenswrapper[4797]: I0930 18:14:04.954715 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9" event={"ID":"2ab411d2-db8f-47ff-9233-739acad6d3ee","Type":"ContainerStarted","Data":"7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8"}
Sep 30 18:14:04 crc kubenswrapper[4797]: I0930 18:14:04.980104 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9" podStartSLOduration=2.386449185 podStartE2EDuration="2.98008465s" podCreationTimestamp="2025-09-30 18:14:02 +0000 UTC" firstStartedPulling="2025-09-30 18:14:03.967348659 +0000 UTC m=+1894.489847897" lastFinishedPulling="2025-09-30 18:14:04.560984104 +0000 UTC m=+1895.083483362" observedRunningTime="2025-09-30 18:14:04.973966232 +0000 UTC m=+1895.496465480" watchObservedRunningTime="2025-09-30 18:14:04.98008465 +0000 UTC m=+1895.502583908"
Sep 30 18:14:14 crc kubenswrapper[4797]: I0930 18:14:14.070185 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfh7r"]
Sep 30 18:14:14 crc kubenswrapper[4797]: I0930 18:14:14.081182 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfh7r"]
Sep 30 18:14:14 crc kubenswrapper[4797]: I0930 18:14:14.250604 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d1bea4-d095-4542-8a3d-7fdffe9ab4bb" path="/var/lib/kubelet/pods/81d1bea4-d095-4542-8a3d-7fdffe9ab4bb/volumes"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.638113 4797 scope.go:117] "RemoveContainer" containerID="3f24f030bf63434f9582048b18eea66b229cf0dba04997ed328d1958c7e7f19f"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.674645 4797 scope.go:117] "RemoveContainer" containerID="3ae8e0e237f66365ba92f165f32edfdaa8b8f7c8a6cc95abb1bcbb417675a776"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.727922 4797 scope.go:117] "RemoveContainer" containerID="9f5d0cbb47dfc1a5274fa316e9351d543fadcf3408574e33ffbafa96870294c8"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.771038 4797 scope.go:117] "RemoveContainer" containerID="a9632e2d8fcf7cca7c0136d063836f5c19209df1e5727ddb63c33ca9086a247c"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.844429 4797 scope.go:117] "RemoveContainer" containerID="325fb9a9a6a2712d0b38659c12486128d10aea0778f1f423920016f9da1f46e8"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.903045 4797 scope.go:117] "RemoveContainer" containerID="8dcf56f382679b43f160733ed83792467ab240c2bdc493a6fb819a33fa2d16df"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.936554 4797 scope.go:117] "RemoveContainer" containerID="83892b750d8911c3cd7169bbb2c7bb7a3a8e0b8cf09aecc950287771af0656e3"
Sep 30 18:14:15 crc kubenswrapper[4797]: I0930 18:14:15.973579 4797 scope.go:117] "RemoveContainer" containerID="fcd6b1d83814a5dca340e0044d874c9c9722f62af998382cdaf15f3faeff0396"
Sep 30 18:14:18 crc kubenswrapper[4797]: I0930 18:14:18.065199 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mltht"]
Sep 30 18:14:18 crc kubenswrapper[4797]: I0930 18:14:18.079335 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mltht"]
Sep 30 18:14:18 crc kubenswrapper[4797]: I0930 18:14:18.261250 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab000491-3207-401f-bd75-033e8b569622" path="/var/lib/kubelet/pods/ab000491-3207-401f-bd75-033e8b569622/volumes"
Sep 30 18:14:47 crc kubenswrapper[4797]: I0930 18:14:47.468624 4797 generic.go:334] "Generic (PLEG): container finished" podID="2ab411d2-db8f-47ff-9233-739acad6d3ee" containerID="2e0ee58e50ad61877e0c7afd60dace6dc2c18cefb20b45072f490bc1efe33342" exitCode=0
Sep 30 18:14:47 crc kubenswrapper[4797]: I0930 18:14:47.468738 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9" event={"ID":"2ab411d2-db8f-47ff-9233-739acad6d3ee","Type":"ContainerDied","Data":"2e0ee58e50ad61877e0c7afd60dace6dc2c18cefb20b45072f490bc1efe33342"}
Sep 30 18:14:48 crc kubenswrapper[4797]: I0930 18:14:48.985648 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.187118 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key\") pod \"2ab411d2-db8f-47ff-9233-739acad6d3ee\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") "
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.187377 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb9s4\" (UniqueName: \"kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4\") pod \"2ab411d2-db8f-47ff-9233-739acad6d3ee\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") "
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.187413 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory\") pod \"2ab411d2-db8f-47ff-9233-739acad6d3ee\" (UID: \"2ab411d2-db8f-47ff-9233-739acad6d3ee\") "
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.214941 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4" (OuterVolumeSpecName: "kube-api-access-sb9s4") pod "2ab411d2-db8f-47ff-9233-739acad6d3ee" (UID: "2ab411d2-db8f-47ff-9233-739acad6d3ee"). InnerVolumeSpecName "kube-api-access-sb9s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.252965 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ab411d2-db8f-47ff-9233-739acad6d3ee" (UID: "2ab411d2-db8f-47ff-9233-739acad6d3ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.256815 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory" (OuterVolumeSpecName: "inventory") pod "2ab411d2-db8f-47ff-9233-739acad6d3ee" (UID: "2ab411d2-db8f-47ff-9233-739acad6d3ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.291320 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.291354 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ab411d2-db8f-47ff-9233-739acad6d3ee-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.291365 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb9s4\" (UniqueName: \"kubernetes.io/projected/2ab411d2-db8f-47ff-9233-739acad6d3ee-kube-api-access-sb9s4\") on node \"crc\" DevicePath \"\""
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.494364 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9" event={"ID":"2ab411d2-db8f-47ff-9233-739acad6d3ee","Type":"ContainerDied","Data":"7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8"}
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.494470 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b712532a4c58910fbbd24b059a1be8a181579894a7b57dcbf2e41a3c4e84ca8"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.494520 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-k5jg9"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.619001 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"]
Sep 30 18:14:49 crc kubenswrapper[4797]: E0930 18:14:49.619711 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab411d2-db8f-47ff-9233-739acad6d3ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.619744 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab411d2-db8f-47ff-9233-739acad6d3ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.620194 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab411d2-db8f-47ff-9233-739acad6d3ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.625193 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.628138 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.629841 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.629844 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.630035 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.641890 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"]
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.805231 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.805358 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qpl\" (UniqueName: \"kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.805512 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.908296 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.908429 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qpl\" (UniqueName: \"kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.908584 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.915861 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.918043 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.940592 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qpl\" (UniqueName: \"kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:49 crc kubenswrapper[4797]: I0930 18:14:49.951258 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:14:50 crc kubenswrapper[4797]: I0930 18:14:50.662478 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"]
Sep 30 18:14:51 crc kubenswrapper[4797]: I0930 18:14:51.522405 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct" event={"ID":"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858","Type":"ContainerStarted","Data":"81d847dce28384c64a51593445ea99151b2422b78e14d9b8ab84e9e111b59029"}
Sep 30 18:14:51 crc kubenswrapper[4797]: I0930 18:14:51.522868 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct" event={"ID":"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858","Type":"ContainerStarted","Data":"693915ca021dcae61565adff85d5f8d5a4768ec3c47778b23db51f9498f304ae"}
Sep 30 18:14:51 crc kubenswrapper[4797]: I0930 18:14:51.545497 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct" podStartSLOduration=2.078273864 podStartE2EDuration="2.545481997s" podCreationTimestamp="2025-09-30 18:14:49 +0000 UTC" firstStartedPulling="2025-09-30 18:14:50.661931074 +0000 UTC m=+1941.184430352" lastFinishedPulling="2025-09-30 18:14:51.129139207 +0000 UTC m=+1941.651638485" observedRunningTime="2025-09-30 18:14:51.540200682 +0000 UTC m=+1942.062699920" watchObservedRunningTime="2025-09-30 18:14:51.545481997 +0000 UTC m=+1942.067981235"
Sep 30 18:14:58 crc kubenswrapper[4797]: I0930 18:14:58.047370 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v7dn7"]
Sep 30 18:14:58 crc kubenswrapper[4797]: I0930 18:14:58.057865 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v7dn7"]
Sep 30 18:14:58 crc kubenswrapper[4797]: I0930 18:14:58.257980 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f803581d-a565-4443-8672-c57c027a3af5" path="/var/lib/kubelet/pods/f803581d-a565-4443-8672-c57c027a3af5/volumes"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.142630 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"]
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.145844 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.148185 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.148752 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.167997 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"]
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.247006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bckv\" (UniqueName: \"kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.247361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.247504 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.349337 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.349458 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.349508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bckv\" (UniqueName: \"kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.351322 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.364034 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.375793 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bckv\" (UniqueName: \"kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv\") pod \"collect-profiles-29320935-vgssn\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.471784 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:00 crc kubenswrapper[4797]: I0930 18:15:00.977892 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"]
Sep 30 18:15:01 crc kubenswrapper[4797]: I0930 18:15:01.657956 4797 generic.go:334] "Generic (PLEG): container finished" podID="3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" containerID="a644d5555d4d432224f1c0c9fac352b7efd8235bd626931c3959ac1471e2fbb7" exitCode=0
Sep 30 18:15:01 crc kubenswrapper[4797]: I0930 18:15:01.658174 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn" event={"ID":"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a","Type":"ContainerDied","Data":"a644d5555d4d432224f1c0c9fac352b7efd8235bd626931c3959ac1471e2fbb7"}
Sep 30 18:15:01 crc kubenswrapper[4797]: I0930 18:15:01.658258 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn" event={"ID":"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a","Type":"ContainerStarted","Data":"9298c40a1cafca6b542572095d39204ab5dc62195a8dfed20a0b6b71cf685599"}
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.068104 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.107362 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume\") pod \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") "
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.107646 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bckv\" (UniqueName: \"kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv\") pod \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") "
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.107872 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume\") pod \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\" (UID: \"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a\") "
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.109804 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" (UID: "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.116757 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv" (OuterVolumeSpecName: "kube-api-access-4bckv") pod "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" (UID: "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a"). InnerVolumeSpecName "kube-api-access-4bckv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.116887 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" (UID: "3f7d13b2-d1cb-4b8c-b677-7aaae221e38a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.210800 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.210859 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.210882 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bckv\" (UniqueName: \"kubernetes.io/projected/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a-kube-api-access-4bckv\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.679981 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn" event={"ID":"3f7d13b2-d1cb-4b8c-b677-7aaae221e38a","Type":"ContainerDied","Data":"9298c40a1cafca6b542572095d39204ab5dc62195a8dfed20a0b6b71cf685599"}
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.680367 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9298c40a1cafca6b542572095d39204ab5dc62195a8dfed20a0b6b71cf685599"
Sep 30 18:15:03 crc kubenswrapper[4797]: I0930 18:15:03.680017 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"
Sep 30 18:15:16 crc kubenswrapper[4797]: I0930 18:15:16.172793 4797 scope.go:117] "RemoveContainer" containerID="0f3d0d7fe75661e083dcc2e480d29f62590dc9195787680c6756b070718e39c9"
Sep 30 18:15:16 crc kubenswrapper[4797]: I0930 18:15:16.238935 4797 scope.go:117] "RemoveContainer" containerID="36664ee1695bbc4f1051bc55f5404f12faafc9e456f39e5f5dbb71b3240d0162"
Sep 30 18:15:44 crc kubenswrapper[4797]: I0930 18:15:44.191609 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 18:15:44 crc kubenswrapper[4797]: I0930 18:15:44.192082 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 18:15:53 crc kubenswrapper[4797]: I0930 18:15:53.235106 4797 generic.go:334] "Generic (PLEG): container finished" podID="994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" containerID="81d847dce28384c64a51593445ea99151b2422b78e14d9b8ab84e9e111b59029" exitCode=2
Sep 30 18:15:53 crc kubenswrapper[4797]: I0930 18:15:53.235239 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct" event={"ID":"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858","Type":"ContainerDied","Data":"81d847dce28384c64a51593445ea99151b2422b78e14d9b8ab84e9e111b59029"}
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.632173 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.805875 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key\") pod \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") "
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.805959 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2qpl\" (UniqueName: \"kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl\") pod \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") "
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.806139 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory\") pod \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\" (UID: \"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858\") "
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.813674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl" (OuterVolumeSpecName: "kube-api-access-n2qpl") pod "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" (UID: "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858"). InnerVolumeSpecName "kube-api-access-n2qpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.841463 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" (UID: "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.842193 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory" (OuterVolumeSpecName: "inventory") pod "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" (UID: "994f3ed9-cd04-48c1-a7ab-f0c3d08b5858"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.909902 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.909964 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2qpl\" (UniqueName: \"kubernetes.io/projected/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-kube-api-access-n2qpl\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:54 crc kubenswrapper[4797]: I0930 18:15:54.909987 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/994f3ed9-cd04-48c1-a7ab-f0c3d08b5858-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 18:15:55 crc kubenswrapper[4797]: I0930 18:15:55.269520 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct" event={"ID":"994f3ed9-cd04-48c1-a7ab-f0c3d08b5858","Type":"ContainerDied","Data":"693915ca021dcae61565adff85d5f8d5a4768ec3c47778b23db51f9498f304ae"}
Sep 30 18:15:55 crc kubenswrapper[4797]: I0930 18:15:55.269601 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="693915ca021dcae61565adff85d5f8d5a4768ec3c47778b23db51f9498f304ae"
Sep 30 18:15:55 crc kubenswrapper[4797]: I0930 18:15:55.269683 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.033892 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj"]
Sep 30 18:16:03 crc kubenswrapper[4797]: E0930 18:16:03.034802 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" containerName="collect-profiles"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.034818 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" containerName="collect-profiles"
Sep 30 18:16:03 crc kubenswrapper[4797]: E0930 18:16:03.034852 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.034862 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.035067 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="994f3ed9-cd04-48c1-a7ab-f0c3d08b5858" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.035096 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" containerName="collect-profiles"
Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.035899 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.039753 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.040009 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.040787 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.041141 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.077269 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj"] Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.083006 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzf77\" (UniqueName: \"kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.083148 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.083250 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.184787 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.184962 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzf77\" (UniqueName: \"kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.185021 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.190781 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: 
\"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.199991 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.207175 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzf77\" (UniqueName: \"kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-g45vj\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.386363 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.978765 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj"] Sep 30 18:16:03 crc kubenswrapper[4797]: I0930 18:16:03.994563 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:16:04 crc kubenswrapper[4797]: I0930 18:16:04.373285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" event={"ID":"d44c9877-2212-4102-8f03-d2cf682cf7b8","Type":"ContainerStarted","Data":"2c9ce3c816788d3f0e16379ab363f3d6b03ed6290196114fdf113c703cc317f7"} Sep 30 18:16:05 crc kubenswrapper[4797]: I0930 18:16:05.391373 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" event={"ID":"d44c9877-2212-4102-8f03-d2cf682cf7b8","Type":"ContainerStarted","Data":"09468dcc268e8969da869cf9b603211e9dbe73fe5234e88b39d3fb545f490208"} Sep 30 18:16:05 crc kubenswrapper[4797]: I0930 18:16:05.415475 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" podStartSLOduration=1.962793394 podStartE2EDuration="2.415426498s" podCreationTimestamp="2025-09-30 18:16:03 +0000 UTC" firstStartedPulling="2025-09-30 18:16:03.994215563 +0000 UTC m=+2014.516714811" lastFinishedPulling="2025-09-30 18:16:04.446848677 +0000 UTC m=+2014.969347915" observedRunningTime="2025-09-30 18:16:05.413815984 +0000 UTC m=+2015.936315232" watchObservedRunningTime="2025-09-30 18:16:05.415426498 +0000 UTC m=+2015.937925756" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.191565 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.192259 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.533542 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.537580 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.578675 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.624199 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.624285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnq59\" (UniqueName: \"kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.624370 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.725661 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.725929 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.725993 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnq59\" (UniqueName: \"kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.726130 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.726354 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.752752 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnq59\" (UniqueName: \"kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59\") pod \"redhat-operators-7fbqf\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:14 crc kubenswrapper[4797]: I0930 18:16:14.875678 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:15 crc kubenswrapper[4797]: I0930 18:16:15.356299 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:15 crc kubenswrapper[4797]: I0930 18:16:15.503506 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerStarted","Data":"9670fd86a983b062cd54405853e16dc0b90e7c412c4a9660d309d4f72fd56cd5"} Sep 30 18:16:16 crc kubenswrapper[4797]: I0930 18:16:16.522878 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerID="1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100" exitCode=0 Sep 30 18:16:16 crc kubenswrapper[4797]: I0930 18:16:16.522962 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerDied","Data":"1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100"} Sep 30 18:16:18 crc kubenswrapper[4797]: I0930 18:16:18.558464 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerStarted","Data":"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b"} Sep 30 18:16:20 crc kubenswrapper[4797]: I0930 18:16:20.579580 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerID="557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b" exitCode=0 Sep 30 18:16:20 crc kubenswrapper[4797]: I0930 18:16:20.579662 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerDied","Data":"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b"} Sep 30 18:16:21 crc kubenswrapper[4797]: I0930 18:16:21.590980 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerStarted","Data":"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555"} Sep 30 18:16:21 crc kubenswrapper[4797]: I0930 18:16:21.611335 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fbqf" podStartSLOduration=3.140159582 podStartE2EDuration="7.61131019s" podCreationTimestamp="2025-09-30 18:16:14 +0000 UTC" firstStartedPulling="2025-09-30 18:16:16.52692686 +0000 UTC m=+2027.049426108" lastFinishedPulling="2025-09-30 18:16:20.998077488 +0000 UTC m=+2031.520576716" observedRunningTime="2025-09-30 18:16:21.610924999 +0000 UTC m=+2032.133424237" watchObservedRunningTime="2025-09-30 18:16:21.61131019 +0000 UTC m=+2032.133809448" Sep 30 18:16:24 crc kubenswrapper[4797]: I0930 18:16:24.876787 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:24 crc kubenswrapper[4797]: I0930 18:16:24.877240 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:25 crc kubenswrapper[4797]: I0930 18:16:25.966548 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7fbqf" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="registry-server" probeResult="failure" output=< Sep 30 18:16:25 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 18:16:25 crc kubenswrapper[4797]: > Sep 30 18:16:34 crc kubenswrapper[4797]: I0930 18:16:34.952071 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:35 crc kubenswrapper[4797]: I0930 18:16:35.009427 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:35 crc kubenswrapper[4797]: I0930 18:16:35.209054 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:36 crc kubenswrapper[4797]: I0930 18:16:36.763669 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7fbqf" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="registry-server" containerID="cri-o://67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555" gracePeriod=2 Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.283411 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.421868 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnq59\" (UniqueName: \"kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59\") pod \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.422068 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content\") pod \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.422203 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities\") pod \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\" (UID: \"ca4664d5-0a68-4726-aea9-a13aebdfa4cb\") " Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.424200 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities" (OuterVolumeSpecName: "utilities") pod "ca4664d5-0a68-4726-aea9-a13aebdfa4cb" (UID: "ca4664d5-0a68-4726-aea9-a13aebdfa4cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.429674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59" (OuterVolumeSpecName: "kube-api-access-qnq59") pod "ca4664d5-0a68-4726-aea9-a13aebdfa4cb" (UID: "ca4664d5-0a68-4726-aea9-a13aebdfa4cb"). InnerVolumeSpecName "kube-api-access-qnq59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.524971 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.525022 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnq59\" (UniqueName: \"kubernetes.io/projected/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-kube-api-access-qnq59\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.552063 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca4664d5-0a68-4726-aea9-a13aebdfa4cb" (UID: "ca4664d5-0a68-4726-aea9-a13aebdfa4cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.627535 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4664d5-0a68-4726-aea9-a13aebdfa4cb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.776682 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerID="67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555" exitCode=0 Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.776725 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerDied","Data":"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555"} Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.776757 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7fbqf" event={"ID":"ca4664d5-0a68-4726-aea9-a13aebdfa4cb","Type":"ContainerDied","Data":"9670fd86a983b062cd54405853e16dc0b90e7c412c4a9660d309d4f72fd56cd5"} Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.776773 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbqf" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.776778 4797 scope.go:117] "RemoveContainer" containerID="67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.805339 4797 scope.go:117] "RemoveContainer" containerID="557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.831840 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.842668 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7fbqf"] Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.842764 4797 scope.go:117] "RemoveContainer" containerID="1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.899050 4797 scope.go:117] "RemoveContainer" containerID="67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555" Sep 30 18:16:37 crc kubenswrapper[4797]: E0930 18:16:37.899503 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555\": container with ID starting with 67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555 not found: ID does not exist" containerID="67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.899541 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555"} err="failed to get container status \"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555\": rpc error: code = NotFound desc = could not find container \"67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555\": container with ID starting with 67bd881650d890eb0f66d51adb25b63ce92971c29ad5a676970639ef0c01e555 not found: ID does not exist" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.899561 4797 scope.go:117] "RemoveContainer" containerID="557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b" Sep 30 18:16:37 crc kubenswrapper[4797]: E0930 18:16:37.899799 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b\": container with ID starting with 557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b not found: ID does not exist" containerID="557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.899882 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b"} err="failed to get container status \"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b\": rpc error: code = NotFound desc = could not find container \"557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b\": container with ID starting with 557b79b2d7b5dcbf552431b3936a99156bd54517760441e46fdf5f57ad4ef23b not found: ID does not exist" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.899944 4797 scope.go:117] "RemoveContainer" containerID="1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100" Sep 30 18:16:37 crc kubenswrapper[4797]: E0930 
18:16:37.900338 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100\": container with ID starting with 1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100 not found: ID does not exist" containerID="1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100" Sep 30 18:16:37 crc kubenswrapper[4797]: I0930 18:16:37.900365 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100"} err="failed to get container status \"1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100\": rpc error: code = NotFound desc = could not find container \"1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100\": container with ID starting with 1b6932d13673fb3c71fddd7d44ffa115cf97bc5008b043e6b142896b87cb7100 not found: ID does not exist" Sep 30 18:16:38 crc kubenswrapper[4797]: I0930 18:16:38.255148 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" path="/var/lib/kubelet/pods/ca4664d5-0a68-4726-aea9-a13aebdfa4cb/volumes" Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.192217 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.192739 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.192801 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.194009 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.194118 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113" gracePeriod=600 Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.865601 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113" exitCode=0 Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.865708 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113"} Sep 30 18:16:44 crc kubenswrapper[4797]: I0930 18:16:44.866479 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"} Sep 30 18:16:44 crc 
kubenswrapper[4797]: I0930 18:16:44.866528 4797 scope.go:117] "RemoveContainer" containerID="389082c0daef4c623c65b85646987836a2958ef15e2bfc10a8f56c20f0053f5b" Sep 30 18:16:58 crc kubenswrapper[4797]: I0930 18:16:58.008048 4797 generic.go:334] "Generic (PLEG): container finished" podID="d44c9877-2212-4102-8f03-d2cf682cf7b8" containerID="09468dcc268e8969da869cf9b603211e9dbe73fe5234e88b39d3fb545f490208" exitCode=0 Sep 30 18:16:58 crc kubenswrapper[4797]: I0930 18:16:58.008092 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" event={"ID":"d44c9877-2212-4102-8f03-d2cf682cf7b8","Type":"ContainerDied","Data":"09468dcc268e8969da869cf9b603211e9dbe73fe5234e88b39d3fb545f490208"} Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.533884 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.587716 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf77\" (UniqueName: \"kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77\") pod \"d44c9877-2212-4102-8f03-d2cf682cf7b8\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.588813 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory\") pod \"d44c9877-2212-4102-8f03-d2cf682cf7b8\" (UID: \"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.588910 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key\") pod \"d44c9877-2212-4102-8f03-d2cf682cf7b8\" (UID: 
\"d44c9877-2212-4102-8f03-d2cf682cf7b8\") " Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.595591 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77" (OuterVolumeSpecName: "kube-api-access-lzf77") pod "d44c9877-2212-4102-8f03-d2cf682cf7b8" (UID: "d44c9877-2212-4102-8f03-d2cf682cf7b8"). InnerVolumeSpecName "kube-api-access-lzf77". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.615231 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d44c9877-2212-4102-8f03-d2cf682cf7b8" (UID: "d44c9877-2212-4102-8f03-d2cf682cf7b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.620679 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory" (OuterVolumeSpecName: "inventory") pod "d44c9877-2212-4102-8f03-d2cf682cf7b8" (UID: "d44c9877-2212-4102-8f03-d2cf682cf7b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.691217 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf77\" (UniqueName: \"kubernetes.io/projected/d44c9877-2212-4102-8f03-d2cf682cf7b8-kube-api-access-lzf77\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.691244 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:16:59 crc kubenswrapper[4797]: I0930 18:16:59.691253 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d44c9877-2212-4102-8f03-d2cf682cf7b8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.031361 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" event={"ID":"d44c9877-2212-4102-8f03-d2cf682cf7b8","Type":"ContainerDied","Data":"2c9ce3c816788d3f0e16379ab363f3d6b03ed6290196114fdf113c703cc317f7"} Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.031526 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9ce3c816788d3f0e16379ab363f3d6b03ed6290196114fdf113c703cc317f7" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.031879 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-g45vj" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.148550 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rwq96"] Sep 30 18:17:00 crc kubenswrapper[4797]: E0930 18:17:00.148968 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44c9877-2212-4102-8f03-d2cf682cf7b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.148988 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44c9877-2212-4102-8f03-d2cf682cf7b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:00 crc kubenswrapper[4797]: E0930 18:17:00.149019 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="extract-utilities" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149027 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="extract-utilities" Sep 30 18:17:00 crc kubenswrapper[4797]: E0930 18:17:00.149038 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="registry-server" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149044 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="registry-server" Sep 30 18:17:00 crc kubenswrapper[4797]: E0930 18:17:00.149070 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="extract-content" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149077 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="extract-content" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149256 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d44c9877-2212-4102-8f03-d2cf682cf7b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149277 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4664d5-0a68-4726-aea9-a13aebdfa4cb" containerName="registry-server" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.149995 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.152864 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.155255 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.155365 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.155689 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.171379 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rwq96"] Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.203816 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.203943 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.204043 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkdxd\" (UniqueName: \"kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.305468 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.305592 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkdxd\" (UniqueName: \"kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.305937 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.312849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.313476 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.326068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkdxd\" (UniqueName: \"kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd\") pod \"ssh-known-hosts-edpm-deployment-rwq96\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:00 crc kubenswrapper[4797]: I0930 18:17:00.471360 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:01 crc kubenswrapper[4797]: I0930 18:17:01.044635 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rwq96"] Sep 30 18:17:01 crc kubenswrapper[4797]: W0930 18:17:01.050620 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1976056c_3312_40a7_b5ae_f287c229e0a3.slice/crio-d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994 WatchSource:0}: Error finding container d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994: Status 404 returned error can't find the container with id d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994 Sep 30 18:17:02 crc kubenswrapper[4797]: I0930 18:17:02.063605 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" event={"ID":"1976056c-3312-40a7-b5ae-f287c229e0a3","Type":"ContainerStarted","Data":"fff6c56ac339027c866d4347e7670ccb0b8e24b858174382d4d8b00e7497cbc8"} Sep 30 18:17:02 crc kubenswrapper[4797]: I0930 18:17:02.064790 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" event={"ID":"1976056c-3312-40a7-b5ae-f287c229e0a3","Type":"ContainerStarted","Data":"d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994"} Sep 30 18:17:02 crc kubenswrapper[4797]: I0930 18:17:02.086114 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" podStartSLOduration=1.485771528 podStartE2EDuration="2.086091026s" podCreationTimestamp="2025-09-30 18:17:00 +0000 UTC" firstStartedPulling="2025-09-30 18:17:01.053508882 +0000 UTC m=+2071.576008120" lastFinishedPulling="2025-09-30 18:17:01.65382834 +0000 UTC m=+2072.176327618" observedRunningTime="2025-09-30 18:17:02.077718827 +0000 UTC m=+2072.600218105" 
watchObservedRunningTime="2025-09-30 18:17:02.086091026 +0000 UTC m=+2072.608590274" Sep 30 18:17:10 crc kubenswrapper[4797]: I0930 18:17:10.150831 4797 generic.go:334] "Generic (PLEG): container finished" podID="1976056c-3312-40a7-b5ae-f287c229e0a3" containerID="fff6c56ac339027c866d4347e7670ccb0b8e24b858174382d4d8b00e7497cbc8" exitCode=0 Sep 30 18:17:10 crc kubenswrapper[4797]: I0930 18:17:10.150901 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" event={"ID":"1976056c-3312-40a7-b5ae-f287c229e0a3","Type":"ContainerDied","Data":"fff6c56ac339027c866d4347e7670ccb0b8e24b858174382d4d8b00e7497cbc8"} Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.642425 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.746859 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkdxd\" (UniqueName: \"kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd\") pod \"1976056c-3312-40a7-b5ae-f287c229e0a3\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.746956 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam\") pod \"1976056c-3312-40a7-b5ae-f287c229e0a3\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.747065 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0\") pod \"1976056c-3312-40a7-b5ae-f287c229e0a3\" (UID: \"1976056c-3312-40a7-b5ae-f287c229e0a3\") " Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 
18:17:11.759242 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd" (OuterVolumeSpecName: "kube-api-access-jkdxd") pod "1976056c-3312-40a7-b5ae-f287c229e0a3" (UID: "1976056c-3312-40a7-b5ae-f287c229e0a3"). InnerVolumeSpecName "kube-api-access-jkdxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.780191 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1976056c-3312-40a7-b5ae-f287c229e0a3" (UID: "1976056c-3312-40a7-b5ae-f287c229e0a3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.785788 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1976056c-3312-40a7-b5ae-f287c229e0a3" (UID: "1976056c-3312-40a7-b5ae-f287c229e0a3"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.849599 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.849642 4797 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1976056c-3312-40a7-b5ae-f287c229e0a3-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:11 crc kubenswrapper[4797]: I0930 18:17:11.849656 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkdxd\" (UniqueName: \"kubernetes.io/projected/1976056c-3312-40a7-b5ae-f287c229e0a3-kube-api-access-jkdxd\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.176356 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" event={"ID":"1976056c-3312-40a7-b5ae-f287c229e0a3","Type":"ContainerDied","Data":"d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994"} Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.176413 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5509fd8695feb450ac2a096ef230c002a396a03dff349a477859572e1f78994" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.176465 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rwq96" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.271133 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd"] Sep 30 18:17:12 crc kubenswrapper[4797]: E0930 18:17:12.271832 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1976056c-3312-40a7-b5ae-f287c229e0a3" containerName="ssh-known-hosts-edpm-deployment" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.271856 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1976056c-3312-40a7-b5ae-f287c229e0a3" containerName="ssh-known-hosts-edpm-deployment" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.272107 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1976056c-3312-40a7-b5ae-f287c229e0a3" containerName="ssh-known-hosts-edpm-deployment" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.274935 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.297501 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd"] Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.302778 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.302927 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.303158 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.303294 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.362163 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.362263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.362365 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9278s\" (UniqueName: \"kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.463470 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9278s\" (UniqueName: \"kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.463631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.463728 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.470510 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.473179 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.479115 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9278s\" (UniqueName: \"kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-grthd\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:12 crc kubenswrapper[4797]: I0930 18:17:12.621255 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:13 crc kubenswrapper[4797]: I0930 18:17:13.226420 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd"] Sep 30 18:17:14 crc kubenswrapper[4797]: I0930 18:17:14.199764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" event={"ID":"9b428893-75cc-423a-8ce8-31ccc2068037","Type":"ContainerStarted","Data":"41c81450eadf3259b217a9c7c6620e4c488ad12bd00d1af8f1998b97c143412c"} Sep 30 18:17:14 crc kubenswrapper[4797]: I0930 18:17:14.200124 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" event={"ID":"9b428893-75cc-423a-8ce8-31ccc2068037","Type":"ContainerStarted","Data":"bce048f0d9d4a15c41f9750451e86537003e64f25d34c87315e9fcb4522f4a0c"} Sep 30 18:17:14 crc kubenswrapper[4797]: I0930 18:17:14.226871 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" podStartSLOduration=1.5946247059999998 podStartE2EDuration="2.226852627s" podCreationTimestamp="2025-09-30 18:17:12 +0000 UTC" firstStartedPulling="2025-09-30 18:17:13.228968693 +0000 UTC m=+2083.751467931" lastFinishedPulling="2025-09-30 18:17:13.861196604 +0000 UTC m=+2084.383695852" observedRunningTime="2025-09-30 18:17:14.224840832 +0000 UTC m=+2084.747340110" watchObservedRunningTime="2025-09-30 18:17:14.226852627 +0000 UTC m=+2084.749351875" Sep 30 18:17:24 crc kubenswrapper[4797]: I0930 18:17:24.324964 4797 generic.go:334] "Generic (PLEG): container finished" podID="9b428893-75cc-423a-8ce8-31ccc2068037" containerID="41c81450eadf3259b217a9c7c6620e4c488ad12bd00d1af8f1998b97c143412c" exitCode=0 Sep 30 18:17:24 crc kubenswrapper[4797]: I0930 18:17:24.325077 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" event={"ID":"9b428893-75cc-423a-8ce8-31ccc2068037","Type":"ContainerDied","Data":"41c81450eadf3259b217a9c7c6620e4c488ad12bd00d1af8f1998b97c143412c"} Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.819810 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.952283 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9278s\" (UniqueName: \"kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s\") pod \"9b428893-75cc-423a-8ce8-31ccc2068037\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.952376 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory\") pod \"9b428893-75cc-423a-8ce8-31ccc2068037\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.952528 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key\") pod \"9b428893-75cc-423a-8ce8-31ccc2068037\" (UID: \"9b428893-75cc-423a-8ce8-31ccc2068037\") " Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.959759 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s" (OuterVolumeSpecName: "kube-api-access-9278s") pod "9b428893-75cc-423a-8ce8-31ccc2068037" (UID: "9b428893-75cc-423a-8ce8-31ccc2068037"). InnerVolumeSpecName "kube-api-access-9278s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.980689 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory" (OuterVolumeSpecName: "inventory") pod "9b428893-75cc-423a-8ce8-31ccc2068037" (UID: "9b428893-75cc-423a-8ce8-31ccc2068037"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:25 crc kubenswrapper[4797]: I0930 18:17:25.989341 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b428893-75cc-423a-8ce8-31ccc2068037" (UID: "9b428893-75cc-423a-8ce8-31ccc2068037"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.055952 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9278s\" (UniqueName: \"kubernetes.io/projected/9b428893-75cc-423a-8ce8-31ccc2068037-kube-api-access-9278s\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.056018 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.056037 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b428893-75cc-423a-8ce8-31ccc2068037-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.346729 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" 
event={"ID":"9b428893-75cc-423a-8ce8-31ccc2068037","Type":"ContainerDied","Data":"bce048f0d9d4a15c41f9750451e86537003e64f25d34c87315e9fcb4522f4a0c"} Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.347230 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce048f0d9d4a15c41f9750451e86537003e64f25d34c87315e9fcb4522f4a0c" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.346828 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-grthd" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.451234 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb"] Sep 30 18:17:26 crc kubenswrapper[4797]: E0930 18:17:26.451725 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b428893-75cc-423a-8ce8-31ccc2068037" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.451745 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b428893-75cc-423a-8ce8-31ccc2068037" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.451976 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b428893-75cc-423a-8ce8-31ccc2068037" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.452683 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.454380 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.454894 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.455365 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.455902 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.464141 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb"] Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.572842 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.572985 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmmg6\" (UniqueName: \"kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.573204 4797 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.675946 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.676090 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.676251 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmmg6\" (UniqueName: \"kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.682812 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: 
\"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.684916 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.712474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmmg6\" (UniqueName: \"kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:26 crc kubenswrapper[4797]: I0930 18:17:26.809910 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:27 crc kubenswrapper[4797]: I0930 18:17:27.211790 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb"] Sep 30 18:17:27 crc kubenswrapper[4797]: I0930 18:17:27.356362 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" event={"ID":"1c7d707f-71bd-4194-b7c2-14a592f9772c","Type":"ContainerStarted","Data":"118562407bc0d345c578007662b0b5fb2939ca149118f2b0e63a5920677a88ec"} Sep 30 18:17:28 crc kubenswrapper[4797]: I0930 18:17:28.369477 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" event={"ID":"1c7d707f-71bd-4194-b7c2-14a592f9772c","Type":"ContainerStarted","Data":"5c99fc24371961f0189d91b97a3b7b66e2383e55d1e1a8b977deaabc0c3b2977"} Sep 30 18:17:28 crc kubenswrapper[4797]: I0930 18:17:28.398123 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" podStartSLOduration=1.987826465 podStartE2EDuration="2.398105499s" podCreationTimestamp="2025-09-30 18:17:26 +0000 UTC" firstStartedPulling="2025-09-30 18:17:27.2210499 +0000 UTC m=+2097.743549138" lastFinishedPulling="2025-09-30 18:17:27.631328914 +0000 UTC m=+2098.153828172" observedRunningTime="2025-09-30 18:17:28.394530832 +0000 UTC m=+2098.917030100" watchObservedRunningTime="2025-09-30 18:17:28.398105499 +0000 UTC m=+2098.920604737" Sep 30 18:17:38 crc kubenswrapper[4797]: I0930 18:17:38.474754 4797 generic.go:334] "Generic (PLEG): container finished" podID="1c7d707f-71bd-4194-b7c2-14a592f9772c" containerID="5c99fc24371961f0189d91b97a3b7b66e2383e55d1e1a8b977deaabc0c3b2977" exitCode=0 Sep 30 18:17:38 crc kubenswrapper[4797]: I0930 18:17:38.474863 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" event={"ID":"1c7d707f-71bd-4194-b7c2-14a592f9772c","Type":"ContainerDied","Data":"5c99fc24371961f0189d91b97a3b7b66e2383e55d1e1a8b977deaabc0c3b2977"} Sep 30 18:17:39 crc kubenswrapper[4797]: I0930 18:17:39.999078 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.070499 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmmg6\" (UniqueName: \"kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6\") pod \"1c7d707f-71bd-4194-b7c2-14a592f9772c\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.070654 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key\") pod \"1c7d707f-71bd-4194-b7c2-14a592f9772c\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.070697 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory\") pod \"1c7d707f-71bd-4194-b7c2-14a592f9772c\" (UID: \"1c7d707f-71bd-4194-b7c2-14a592f9772c\") " Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.078614 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6" (OuterVolumeSpecName: "kube-api-access-qmmg6") pod "1c7d707f-71bd-4194-b7c2-14a592f9772c" (UID: "1c7d707f-71bd-4194-b7c2-14a592f9772c"). InnerVolumeSpecName "kube-api-access-qmmg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.098098 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c7d707f-71bd-4194-b7c2-14a592f9772c" (UID: "1c7d707f-71bd-4194-b7c2-14a592f9772c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.099954 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory" (OuterVolumeSpecName: "inventory") pod "1c7d707f-71bd-4194-b7c2-14a592f9772c" (UID: "1c7d707f-71bd-4194-b7c2-14a592f9772c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.172892 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmmg6\" (UniqueName: \"kubernetes.io/projected/1c7d707f-71bd-4194-b7c2-14a592f9772c-kube-api-access-qmmg6\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.172927 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.172937 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c7d707f-71bd-4194-b7c2-14a592f9772c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.499139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" 
event={"ID":"1c7d707f-71bd-4194-b7c2-14a592f9772c","Type":"ContainerDied","Data":"118562407bc0d345c578007662b0b5fb2939ca149118f2b0e63a5920677a88ec"} Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.499180 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="118562407bc0d345c578007662b0b5fb2939ca149118f2b0e63a5920677a88ec" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.499638 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.637106 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7"] Sep 30 18:17:40 crc kubenswrapper[4797]: E0930 18:17:40.637816 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7d707f-71bd-4194-b7c2-14a592f9772c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.637849 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7d707f-71bd-4194-b7c2-14a592f9772c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.638264 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7d707f-71bd-4194-b7c2-14a592f9772c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.639347 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.646019 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.646421 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.646682 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.647941 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.648516 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.648785 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.649314 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.649590 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.653611 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7"] Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685249 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685309 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685373 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685502 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685531 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685558 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685632 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685697 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685772 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685898 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.685962 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.686012 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnc4\" (UniqueName: 
\"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.686099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.787866 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.787994 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788026 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788072 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788101 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788147 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788169 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788192 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788243 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788284 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788306 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: 
\"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788331 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.788410 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnc4\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.792373 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 
30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.793200 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.793767 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.793785 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.793972 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.794164 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.795030 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.796232 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.796680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.796899 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.797727 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.797783 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.797895 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.811126 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnc4\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65dn7\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:40 crc kubenswrapper[4797]: I0930 18:17:40.963063 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:17:41 crc kubenswrapper[4797]: I0930 18:17:41.557957 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7"] Sep 30 18:17:42 crc kubenswrapper[4797]: I0930 18:17:42.527348 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" event={"ID":"047af11f-ac90-41df-96c5-be75581aff10","Type":"ContainerStarted","Data":"94c0c0f07cfae01d9387988f5f0823a916e5f75eac1d40a404139475f4fbb5d2"} Sep 30 18:17:42 crc kubenswrapper[4797]: I0930 18:17:42.527617 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" event={"ID":"047af11f-ac90-41df-96c5-be75581aff10","Type":"ContainerStarted","Data":"a7dd46190d60d33740057a7cdb92265271b8e64b4419e96e67aba7020801f907"} Sep 30 18:17:42 crc kubenswrapper[4797]: I0930 18:17:42.558842 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" podStartSLOduration=2.1236201279999998 podStartE2EDuration="2.558779433s" podCreationTimestamp="2025-09-30 18:17:40 +0000 UTC" firstStartedPulling="2025-09-30 18:17:41.560334844 +0000 UTC m=+2112.082834072" lastFinishedPulling="2025-09-30 18:17:41.995494139 +0000 UTC m=+2112.517993377" observedRunningTime="2025-09-30 18:17:42.555631877 +0000 UTC m=+2113.078131175" watchObservedRunningTime="2025-09-30 18:17:42.558779433 +0000 UTC m=+2113.081278711" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.211249 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 
18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.215150 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.222941 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.251021 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxdf\" (UniqueName: \"kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.251092 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.251153 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.353460 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxdf\" (UniqueName: \"kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " 
pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.353529 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.353629 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.354172 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.354906 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.383563 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxdf\" (UniqueName: \"kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf\") pod \"redhat-marketplace-8nfkb\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 
18:17:53 crc kubenswrapper[4797]: I0930 18:17:53.556058 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:17:54 crc kubenswrapper[4797]: W0930 18:17:54.075627 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b080b7b_57f0_4dc1_b6d2_a7a18b0c35df.slice/crio-3b115a8fac9452c5034e3cee6e42d14ebbe966ecc819d556c43308aa07a84602 WatchSource:0}: Error finding container 3b115a8fac9452c5034e3cee6e42d14ebbe966ecc819d556c43308aa07a84602: Status 404 returned error can't find the container with id 3b115a8fac9452c5034e3cee6e42d14ebbe966ecc819d556c43308aa07a84602 Sep 30 18:17:54 crc kubenswrapper[4797]: I0930 18:17:54.075743 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 18:17:54 crc kubenswrapper[4797]: I0930 18:17:54.666195 4797 generic.go:334] "Generic (PLEG): container finished" podID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerID="9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798" exitCode=0 Sep 30 18:17:54 crc kubenswrapper[4797]: I0930 18:17:54.666459 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerDied","Data":"9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798"} Sep 30 18:17:54 crc kubenswrapper[4797]: I0930 18:17:54.666586 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerStarted","Data":"3b115a8fac9452c5034e3cee6e42d14ebbe966ecc819d556c43308aa07a84602"} Sep 30 18:17:55 crc kubenswrapper[4797]: I0930 18:17:55.677513 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" 
event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerStarted","Data":"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4"} Sep 30 18:17:56 crc kubenswrapper[4797]: I0930 18:17:56.687468 4797 generic.go:334] "Generic (PLEG): container finished" podID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerID="39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4" exitCode=0 Sep 30 18:17:56 crc kubenswrapper[4797]: I0930 18:17:56.687519 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerDied","Data":"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4"} Sep 30 18:17:57 crc kubenswrapper[4797]: I0930 18:17:57.701377 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerStarted","Data":"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af"} Sep 30 18:17:57 crc kubenswrapper[4797]: I0930 18:17:57.732764 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8nfkb" podStartSLOduration=2.272271989 podStartE2EDuration="4.732739522s" podCreationTimestamp="2025-09-30 18:17:53 +0000 UTC" firstStartedPulling="2025-09-30 18:17:54.6697123 +0000 UTC m=+2125.192211538" lastFinishedPulling="2025-09-30 18:17:57.130179833 +0000 UTC m=+2127.652679071" observedRunningTime="2025-09-30 18:17:57.727282932 +0000 UTC m=+2128.249782170" watchObservedRunningTime="2025-09-30 18:17:57.732739522 +0000 UTC m=+2128.255238800" Sep 30 18:18:03 crc kubenswrapper[4797]: I0930 18:18:03.556271 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:03 crc kubenswrapper[4797]: I0930 18:18:03.556933 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:03 crc kubenswrapper[4797]: I0930 18:18:03.624134 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:03 crc kubenswrapper[4797]: I0930 18:18:03.825227 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:03 crc kubenswrapper[4797]: I0930 18:18:03.885235 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 18:18:05 crc kubenswrapper[4797]: I0930 18:18:05.781588 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8nfkb" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="registry-server" containerID="cri-o://26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af" gracePeriod=2 Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.263769 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.337084 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content\") pod \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.337413 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxdf\" (UniqueName: \"kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf\") pod \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.337561 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities\") pod \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\" (UID: \"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df\") " Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.338282 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities" (OuterVolumeSpecName: "utilities") pod "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" (UID: "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.343888 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf" (OuterVolumeSpecName: "kube-api-access-lbxdf") pod "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" (UID: "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df"). InnerVolumeSpecName "kube-api-access-lbxdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.349766 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" (UID: "2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.439488 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbxdf\" (UniqueName: \"kubernetes.io/projected/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-kube-api-access-lbxdf\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.439525 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.439540 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.795011 4797 generic.go:334] "Generic (PLEG): container finished" podID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerID="26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af" exitCode=0 Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.795098 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerDied","Data":"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af"} Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.795137 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8nfkb" event={"ID":"2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df","Type":"ContainerDied","Data":"3b115a8fac9452c5034e3cee6e42d14ebbe966ecc819d556c43308aa07a84602"} Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.795168 4797 scope.go:117] "RemoveContainer" containerID="26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.795364 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkb" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.858782 4797 scope.go:117] "RemoveContainer" containerID="39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.861244 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.879863 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkb"] Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.883518 4797 scope.go:117] "RemoveContainer" containerID="9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.933309 4797 scope.go:117] "RemoveContainer" containerID="26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af" Sep 30 18:18:06 crc kubenswrapper[4797]: E0930 18:18:06.934037 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af\": container with ID starting with 26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af not found: ID does not exist" containerID="26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.934070 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af"} err="failed to get container status \"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af\": rpc error: code = NotFound desc = could not find container \"26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af\": container with ID starting with 26c3acd0e253119f196d886ada08f89770f1fcda333130ecfdc688d3bef840af not found: ID does not exist" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.934090 4797 scope.go:117] "RemoveContainer" containerID="39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4" Sep 30 18:18:06 crc kubenswrapper[4797]: E0930 18:18:06.934661 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4\": container with ID starting with 39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4 not found: ID does not exist" containerID="39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.934713 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4"} err="failed to get container status \"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4\": rpc error: code = NotFound desc = could not find container \"39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4\": container with ID starting with 39e046e5c46189cfe6c164e618461340d3a4bbde47bdfd503695b3d979c42fb4 not found: ID does not exist" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.934746 4797 scope.go:117] "RemoveContainer" containerID="9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798" Sep 30 18:18:06 crc kubenswrapper[4797]: E0930 
18:18:06.935159 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798\": container with ID starting with 9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798 not found: ID does not exist" containerID="9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798" Sep 30 18:18:06 crc kubenswrapper[4797]: I0930 18:18:06.935197 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798"} err="failed to get container status \"9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798\": rpc error: code = NotFound desc = could not find container \"9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798\": container with ID starting with 9f33d58e20733712983ed2241719f1e432f4b6ab1b3abf79bf65e8275e9dd798 not found: ID does not exist" Sep 30 18:18:08 crc kubenswrapper[4797]: I0930 18:18:08.253169 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" path="/var/lib/kubelet/pods/2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df/volumes" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.645544 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:16 crc kubenswrapper[4797]: E0930 18:18:16.646551 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="registry-server" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.646568 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="registry-server" Sep 30 18:18:16 crc kubenswrapper[4797]: E0930 18:18:16.646612 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="extract-utilities" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.646623 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="extract-utilities" Sep 30 18:18:16 crc kubenswrapper[4797]: E0930 18:18:16.646668 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="extract-content" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.646676 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="extract-content" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.646904 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b080b7b-57f0-4dc1-b6d2-a7a18b0c35df" containerName="registry-server" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.649021 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.668412 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.759122 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzsvl\" (UniqueName: \"kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.759184 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content\") pod \"community-operators-65m8q\" (UID: 
\"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.759408 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.861388 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.861645 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.861738 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzsvl\" (UniqueName: \"kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.862675 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content\") pod \"community-operators-65m8q\" (UID: 
\"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.863096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.894659 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzsvl\" (UniqueName: \"kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl\") pod \"community-operators-65m8q\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:16 crc kubenswrapper[4797]: I0930 18:18:16.979405 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:17 crc kubenswrapper[4797]: I0930 18:18:17.349782 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:17 crc kubenswrapper[4797]: I0930 18:18:17.926243 4797 generic.go:334] "Generic (PLEG): container finished" podID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerID="954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16" exitCode=0 Sep 30 18:18:17 crc kubenswrapper[4797]: I0930 18:18:17.926535 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerDied","Data":"954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16"} Sep 30 18:18:17 crc kubenswrapper[4797]: I0930 18:18:17.926664 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" 
event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerStarted","Data":"e1bac2a97e190586278831f58caee60dbd94cd041e094797a653cecf6f6e5e93"} Sep 30 18:18:19 crc kubenswrapper[4797]: I0930 18:18:19.946242 4797 generic.go:334] "Generic (PLEG): container finished" podID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerID="6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d" exitCode=0 Sep 30 18:18:19 crc kubenswrapper[4797]: I0930 18:18:19.946326 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerDied","Data":"6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d"} Sep 30 18:18:20 crc kubenswrapper[4797]: I0930 18:18:20.958188 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerStarted","Data":"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d"} Sep 30 18:18:20 crc kubenswrapper[4797]: I0930 18:18:20.982536 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-65m8q" podStartSLOduration=2.376531997 podStartE2EDuration="4.982514303s" podCreationTimestamp="2025-09-30 18:18:16 +0000 UTC" firstStartedPulling="2025-09-30 18:18:17.929603982 +0000 UTC m=+2148.452103250" lastFinishedPulling="2025-09-30 18:18:20.535586298 +0000 UTC m=+2151.058085556" observedRunningTime="2025-09-30 18:18:20.979907932 +0000 UTC m=+2151.502407190" watchObservedRunningTime="2025-09-30 18:18:20.982514303 +0000 UTC m=+2151.505013571" Sep 30 18:18:26 crc kubenswrapper[4797]: I0930 18:18:26.980310 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:26 crc kubenswrapper[4797]: I0930 18:18:26.981232 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:27 crc kubenswrapper[4797]: I0930 18:18:27.033235 4797 generic.go:334] "Generic (PLEG): container finished" podID="047af11f-ac90-41df-96c5-be75581aff10" containerID="94c0c0f07cfae01d9387988f5f0823a916e5f75eac1d40a404139475f4fbb5d2" exitCode=0 Sep 30 18:18:27 crc kubenswrapper[4797]: I0930 18:18:27.033296 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" event={"ID":"047af11f-ac90-41df-96c5-be75581aff10","Type":"ContainerDied","Data":"94c0c0f07cfae01d9387988f5f0823a916e5f75eac1d40a404139475f4fbb5d2"} Sep 30 18:18:27 crc kubenswrapper[4797]: I0930 18:18:27.052195 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:27 crc kubenswrapper[4797]: I0930 18:18:27.123226 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:27 crc kubenswrapper[4797]: I0930 18:18:27.316707 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.513798 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614642 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614739 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614817 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614854 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614882 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.614921 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615005 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615037 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615105 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615132 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc 
kubenswrapper[4797]: I0930 18:18:28.615185 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615223 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnnc4\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615254 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.615290 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory\") pod \"047af11f-ac90-41df-96c5-be75581aff10\" (UID: \"047af11f-ac90-41df-96c5-be75581aff10\") " Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.633633 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.633705 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.633784 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.645671 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.646258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.646862 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.647982 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.648676 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.649853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4" (OuterVolumeSpecName: "kube-api-access-tnnc4") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). 
InnerVolumeSpecName "kube-api-access-tnnc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.650208 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.652932 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.661609 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.693186 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory" (OuterVolumeSpecName: "inventory") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.695370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "047af11f-ac90-41df-96c5-be75581aff10" (UID: "047af11f-ac90-41df-96c5-be75581aff10"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.717968 4797 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718014 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnnc4\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-kube-api-access-tnnc4\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718028 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718046 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718058 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718069 4797 reconciler_common.go:293] "Volume detached for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718078 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718090 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718104 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718115 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718124 4797 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718134 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 
18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718146 4797 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047af11f-ac90-41df-96c5-be75581aff10-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:28 crc kubenswrapper[4797]: I0930 18:18:28.718157 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/047af11f-ac90-41df-96c5-be75581aff10-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.054748 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.054760 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65dn7" event={"ID":"047af11f-ac90-41df-96c5-be75581aff10","Type":"ContainerDied","Data":"a7dd46190d60d33740057a7cdb92265271b8e64b4419e96e67aba7020801f907"} Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.054845 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dd46190d60d33740057a7cdb92265271b8e64b4419e96e67aba7020801f907" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.054910 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-65m8q" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="registry-server" containerID="cri-o://e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d" gracePeriod=2 Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.210407 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9"] Sep 30 18:18:29 crc kubenswrapper[4797]: E0930 
18:18:29.210919 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047af11f-ac90-41df-96c5-be75581aff10" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.210946 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="047af11f-ac90-41df-96c5-be75581aff10" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.211217 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="047af11f-ac90-41df-96c5-be75581aff10" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.211987 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.214260 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.214576 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.214986 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.215009 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.215883 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.233740 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.233811 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.233848 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.233906 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.233961 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wjs\" (UniqueName: \"kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.236420 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9"] Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.336058 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.336173 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.336207 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.336323 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.336427 
4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wjs\" (UniqueName: \"kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.340695 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.340848 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.342495 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.344308 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.361051 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wjs\" (UniqueName: \"kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dmdz9\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.580810 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.582487 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.640797 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content\") pod \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.640841 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzsvl\" (UniqueName: \"kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl\") pod \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.640900 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities\") pod \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\" (UID: \"f0bc0b07-ef97-4560-97d8-44ba78047d8b\") " Sep 30 18:18:29 crc kubenswrapper[4797]: 
I0930 18:18:29.642301 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities" (OuterVolumeSpecName: "utilities") pod "f0bc0b07-ef97-4560-97d8-44ba78047d8b" (UID: "f0bc0b07-ef97-4560-97d8-44ba78047d8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.656244 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl" (OuterVolumeSpecName: "kube-api-access-xzsvl") pod "f0bc0b07-ef97-4560-97d8-44ba78047d8b" (UID: "f0bc0b07-ef97-4560-97d8-44ba78047d8b"). InnerVolumeSpecName "kube-api-access-xzsvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.746193 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzsvl\" (UniqueName: \"kubernetes.io/projected/f0bc0b07-ef97-4560-97d8-44ba78047d8b-kube-api-access-xzsvl\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.746220 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.853159 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0bc0b07-ef97-4560-97d8-44ba78047d8b" (UID: "f0bc0b07-ef97-4560-97d8-44ba78047d8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:18:29 crc kubenswrapper[4797]: I0930 18:18:29.949395 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc0b07-ef97-4560-97d8-44ba78047d8b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.068994 4797 generic.go:334] "Generic (PLEG): container finished" podID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerID="e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d" exitCode=0 Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.069041 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerDied","Data":"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d"} Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.069091 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65m8q" event={"ID":"f0bc0b07-ef97-4560-97d8-44ba78047d8b","Type":"ContainerDied","Data":"e1bac2a97e190586278831f58caee60dbd94cd041e094797a653cecf6f6e5e93"} Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.069122 4797 scope.go:117] "RemoveContainer" containerID="e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.069179 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65m8q" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.108249 4797 scope.go:117] "RemoveContainer" containerID="6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.135015 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.145288 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-65m8q"] Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.154165 4797 scope.go:117] "RemoveContainer" containerID="954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.190662 4797 scope.go:117] "RemoveContainer" containerID="e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d" Sep 30 18:18:30 crc kubenswrapper[4797]: E0930 18:18:30.191271 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d\": container with ID starting with e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d not found: ID does not exist" containerID="e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.191318 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d"} err="failed to get container status \"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d\": rpc error: code = NotFound desc = could not find container \"e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d\": container with ID starting with e40808b14699c7b6148935bc3d9c07816ba7631c418b5347cda67e817d1d2e0d not 
found: ID does not exist" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.191350 4797 scope.go:117] "RemoveContainer" containerID="6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d" Sep 30 18:18:30 crc kubenswrapper[4797]: E0930 18:18:30.191853 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d\": container with ID starting with 6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d not found: ID does not exist" containerID="6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.191925 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d"} err="failed to get container status \"6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d\": rpc error: code = NotFound desc = could not find container \"6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d\": container with ID starting with 6b1d96359542946bc6b95cf54326e4eb035a0654d40e81dc4b65a578593e946d not found: ID does not exist" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.191969 4797 scope.go:117] "RemoveContainer" containerID="954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16" Sep 30 18:18:30 crc kubenswrapper[4797]: E0930 18:18:30.192513 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16\": container with ID starting with 954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16 not found: ID does not exist" containerID="954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.192549 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16"} err="failed to get container status \"954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16\": rpc error: code = NotFound desc = could not find container \"954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16\": container with ID starting with 954c0611776f4e1a014ccd0c6fe4ed61fc7b0fdbe406f19cd79bb004df341f16 not found: ID does not exist" Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.197644 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9"] Sep 30 18:18:30 crc kubenswrapper[4797]: I0930 18:18:30.260921 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" path="/var/lib/kubelet/pods/f0bc0b07-ef97-4560-97d8-44ba78047d8b/volumes" Sep 30 18:18:31 crc kubenswrapper[4797]: I0930 18:18:31.080990 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" event={"ID":"3d61414a-adba-4fcd-b3ca-417935b2c4db","Type":"ContainerStarted","Data":"37e4c9425b2c0f6bc09e77e4e1f8621d8b56da78379dceb8242be5dbb703ce4f"} Sep 30 18:18:31 crc kubenswrapper[4797]: I0930 18:18:31.372717 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:18:32 crc kubenswrapper[4797]: I0930 18:18:32.098122 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" event={"ID":"3d61414a-adba-4fcd-b3ca-417935b2c4db","Type":"ContainerStarted","Data":"102e937e81f82310301af48cc0e08f0c884ecc355da3e01695cc43ed8b3baee4"} Sep 30 18:18:32 crc kubenswrapper[4797]: I0930 18:18:32.139407 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" 
podStartSLOduration=1.974398494 podStartE2EDuration="3.139344481s" podCreationTimestamp="2025-09-30 18:18:29 +0000 UTC" firstStartedPulling="2025-09-30 18:18:30.204250964 +0000 UTC m=+2160.726750212" lastFinishedPulling="2025-09-30 18:18:31.369196941 +0000 UTC m=+2161.891696199" observedRunningTime="2025-09-30 18:18:32.120636438 +0000 UTC m=+2162.643135706" watchObservedRunningTime="2025-09-30 18:18:32.139344481 +0000 UTC m=+2162.661843789" Sep 30 18:18:44 crc kubenswrapper[4797]: I0930 18:18:44.191482 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:18:44 crc kubenswrapper[4797]: I0930 18:18:44.192284 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:19:14 crc kubenswrapper[4797]: I0930 18:19:14.191930 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:19:14 crc kubenswrapper[4797]: I0930 18:19:14.194677 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 
18:19:44.191765 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.192533 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.192601 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.193731 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.193854 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" gracePeriod=600 Sep 30 18:19:44 crc kubenswrapper[4797]: E0930 18:19:44.332107 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.904240 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" exitCode=0 Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.904351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"} Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.904669 4797 scope.go:117] "RemoveContainer" containerID="cf32d40964fc39c3364a5ea8bb7a60c96ff0ac48043bce05f91c1b37dde2e113" Sep 30 18:19:44 crc kubenswrapper[4797]: I0930 18:19:44.905765 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:19:44 crc kubenswrapper[4797]: E0930 18:19:44.906233 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:19:45 crc kubenswrapper[4797]: I0930 18:19:45.914019 4797 generic.go:334] "Generic (PLEG): container finished" podID="3d61414a-adba-4fcd-b3ca-417935b2c4db" containerID="102e937e81f82310301af48cc0e08f0c884ecc355da3e01695cc43ed8b3baee4" exitCode=0 Sep 30 18:19:45 crc kubenswrapper[4797]: I0930 
18:19:45.914105 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" event={"ID":"3d61414a-adba-4fcd-b3ca-417935b2c4db","Type":"ContainerDied","Data":"102e937e81f82310301af48cc0e08f0c884ecc355da3e01695cc43ed8b3baee4"} Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.384657 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.483402 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key\") pod \"3d61414a-adba-4fcd-b3ca-417935b2c4db\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.483534 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0\") pod \"3d61414a-adba-4fcd-b3ca-417935b2c4db\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.483852 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle\") pod \"3d61414a-adba-4fcd-b3ca-417935b2c4db\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.483926 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wjs\" (UniqueName: \"kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs\") pod \"3d61414a-adba-4fcd-b3ca-417935b2c4db\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.483956 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory\") pod \"3d61414a-adba-4fcd-b3ca-417935b2c4db\" (UID: \"3d61414a-adba-4fcd-b3ca-417935b2c4db\") " Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.492734 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3d61414a-adba-4fcd-b3ca-417935b2c4db" (UID: "3d61414a-adba-4fcd-b3ca-417935b2c4db"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.492842 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs" (OuterVolumeSpecName: "kube-api-access-44wjs") pod "3d61414a-adba-4fcd-b3ca-417935b2c4db" (UID: "3d61414a-adba-4fcd-b3ca-417935b2c4db"). InnerVolumeSpecName "kube-api-access-44wjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.514989 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3d61414a-adba-4fcd-b3ca-417935b2c4db" (UID: "3d61414a-adba-4fcd-b3ca-417935b2c4db"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.534042 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3d61414a-adba-4fcd-b3ca-417935b2c4db" (UID: "3d61414a-adba-4fcd-b3ca-417935b2c4db"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.540893 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory" (OuterVolumeSpecName: "inventory") pod "3d61414a-adba-4fcd-b3ca-417935b2c4db" (UID: "3d61414a-adba-4fcd-b3ca-417935b2c4db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.586966 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.587027 4797 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.587053 4797 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.587076 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wjs\" (UniqueName: \"kubernetes.io/projected/3d61414a-adba-4fcd-b3ca-417935b2c4db-kube-api-access-44wjs\") on node \"crc\" DevicePath \"\"" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.587143 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61414a-adba-4fcd-b3ca-417935b2c4db-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.939782 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" event={"ID":"3d61414a-adba-4fcd-b3ca-417935b2c4db","Type":"ContainerDied","Data":"37e4c9425b2c0f6bc09e77e4e1f8621d8b56da78379dceb8242be5dbb703ce4f"} Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.939822 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dmdz9" Sep 30 18:19:47 crc kubenswrapper[4797]: I0930 18:19:47.939824 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e4c9425b2c0f6bc09e77e4e1f8621d8b56da78379dceb8242be5dbb703ce4f" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.066815 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q"] Sep 30 18:19:48 crc kubenswrapper[4797]: E0930 18:19:48.067592 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="registry-server" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.067637 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="registry-server" Sep 30 18:19:48 crc kubenswrapper[4797]: E0930 18:19:48.067681 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="extract-content" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.067698 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="extract-content" Sep 30 18:19:48 crc kubenswrapper[4797]: E0930 18:19:48.067740 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="extract-utilities" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.067757 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" 
containerName="extract-utilities" Sep 30 18:19:48 crc kubenswrapper[4797]: E0930 18:19:48.067805 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d61414a-adba-4fcd-b3ca-417935b2c4db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.067822 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d61414a-adba-4fcd-b3ca-417935b2c4db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.068299 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d61414a-adba-4fcd-b3ca-417935b2c4db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.068376 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0bc0b07-ef97-4560-97d8-44ba78047d8b" containerName="registry-server" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.069935 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.076502 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.076904 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.077199 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.077709 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.078185 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.078487 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q"] Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.078710 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.203285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.203517 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqbv\" (UniqueName: \"kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.203670 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.203726 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.203960 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.204576 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.305910 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.306197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.306233 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqbv\" (UniqueName: \"kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.306267 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.306287 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.306344 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.312843 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.312993 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: 
\"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.313092 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.313697 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.319530 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.325593 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqbv\" (UniqueName: \"kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc 
kubenswrapper[4797]: I0930 18:19:48.401350 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:19:48 crc kubenswrapper[4797]: I0930 18:19:48.978233 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q"] Sep 30 18:19:49 crc kubenswrapper[4797]: I0930 18:19:49.962930 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" event={"ID":"93bccc4f-33ad-45b9-9549-20ba5484888f","Type":"ContainerStarted","Data":"29fbe5d95ab8779f63a1a3e1639f104c7eb66fdb87bcc29b2865e2df64dd7c68"} Sep 30 18:19:49 crc kubenswrapper[4797]: I0930 18:19:49.964543 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" event={"ID":"93bccc4f-33ad-45b9-9549-20ba5484888f","Type":"ContainerStarted","Data":"19395efe89b516698a9955df3c33d12fede7c54582704dbbfb9b9cb425f5dcf8"} Sep 30 18:19:49 crc kubenswrapper[4797]: I0930 18:19:49.987837 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" podStartSLOduration=1.5357553739999998 podStartE2EDuration="1.987810781s" podCreationTimestamp="2025-09-30 18:19:48 +0000 UTC" firstStartedPulling="2025-09-30 18:19:48.99372181 +0000 UTC m=+2239.516221088" lastFinishedPulling="2025-09-30 18:19:49.445777217 +0000 UTC m=+2239.968276495" observedRunningTime="2025-09-30 18:19:49.98230819 +0000 UTC m=+2240.504807438" watchObservedRunningTime="2025-09-30 18:19:49.987810781 +0000 UTC m=+2240.510310029" Sep 30 18:19:58 crc kubenswrapper[4797]: I0930 18:19:58.239233 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:19:58 crc kubenswrapper[4797]: E0930 18:19:58.240085 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:20:13 crc kubenswrapper[4797]: I0930 18:20:13.238169 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:20:13 crc kubenswrapper[4797]: E0930 18:20:13.239208 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:20:24 crc kubenswrapper[4797]: I0930 18:20:24.239612 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:20:24 crc kubenswrapper[4797]: E0930 18:20:24.242178 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:20:37 crc kubenswrapper[4797]: I0930 18:20:37.239226 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:20:37 crc kubenswrapper[4797]: E0930 
18:20:37.240287 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:20:46 crc kubenswrapper[4797]: I0930 18:20:46.595774 4797 generic.go:334] "Generic (PLEG): container finished" podID="93bccc4f-33ad-45b9-9549-20ba5484888f" containerID="29fbe5d95ab8779f63a1a3e1639f104c7eb66fdb87bcc29b2865e2df64dd7c68" exitCode=0 Sep 30 18:20:46 crc kubenswrapper[4797]: I0930 18:20:46.595894 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" event={"ID":"93bccc4f-33ad-45b9-9549-20ba5484888f","Type":"ContainerDied","Data":"29fbe5d95ab8779f63a1a3e1639f104c7eb66fdb87bcc29b2865e2df64dd7c68"} Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.169347 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.284467 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.284548 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.284807 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.284905 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.285026 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqqbv\" (UniqueName: \"kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 
18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.285066 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory\") pod \"93bccc4f-33ad-45b9-9549-20ba5484888f\" (UID: \"93bccc4f-33ad-45b9-9549-20ba5484888f\") " Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.292125 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.310487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv" (OuterVolumeSpecName: "kube-api-access-pqqbv") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "kube-api-access-pqqbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.314494 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory" (OuterVolumeSpecName: "inventory") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.315625 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.318206 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.331707 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "93bccc4f-33ad-45b9-9549-20ba5484888f" (UID: "93bccc4f-33ad-45b9-9549-20ba5484888f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.387917 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.387959 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqqbv\" (UniqueName: \"kubernetes.io/projected/93bccc4f-33ad-45b9-9549-20ba5484888f-kube-api-access-pqqbv\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.387972 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.387986 4797 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.387998 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.388007 4797 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bccc4f-33ad-45b9-9549-20ba5484888f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.622547 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" 
event={"ID":"93bccc4f-33ad-45b9-9549-20ba5484888f","Type":"ContainerDied","Data":"19395efe89b516698a9955df3c33d12fede7c54582704dbbfb9b9cb425f5dcf8"} Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.622607 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19395efe89b516698a9955df3c33d12fede7c54582704dbbfb9b9cb425f5dcf8" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.622623 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.748368 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94"] Sep 30 18:20:48 crc kubenswrapper[4797]: E0930 18:20:48.748946 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93bccc4f-33ad-45b9-9549-20ba5484888f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.749012 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="93bccc4f-33ad-45b9-9549-20ba5484888f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.749260 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="93bccc4f-33ad-45b9-9549-20ba5484888f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.750055 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.752400 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.752405 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.752629 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.752681 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.752753 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.764881 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94"] Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.900233 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.900633 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbxwl\" (UniqueName: \"kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: 
\"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.900744 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.900771 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:48 crc kubenswrapper[4797]: I0930 18:20:48.900830 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.003266 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.003477 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xbxwl\" (UniqueName: \"kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.003649 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.003687 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.003748 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.013513 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.013553 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.013687 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.014472 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.029096 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbxwl\" (UniqueName: \"kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vzb94\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.069721 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.239283 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:20:49 crc kubenswrapper[4797]: E0930 18:20:49.239758 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:20:49 crc kubenswrapper[4797]: I0930 18:20:49.650695 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94"] Sep 30 18:20:49 crc kubenswrapper[4797]: W0930 18:20:49.662589 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda008ddae_ddb5_47a3_9423_0da1ffdb8322.slice/crio-8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f WatchSource:0}: Error finding container 8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f: Status 404 returned error can't find the container with id 8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f Sep 30 18:20:50 crc kubenswrapper[4797]: I0930 18:20:50.652835 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" event={"ID":"a008ddae-ddb5-47a3-9423-0da1ffdb8322","Type":"ContainerStarted","Data":"c4ce100151ff478a273d335bd8f9de72802cf67437513f2eb78acf3150280c30"} Sep 30 18:20:50 crc kubenswrapper[4797]: I0930 18:20:50.653247 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" event={"ID":"a008ddae-ddb5-47a3-9423-0da1ffdb8322","Type":"ContainerStarted","Data":"8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f"} Sep 30 18:20:50 crc kubenswrapper[4797]: I0930 18:20:50.678374 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" podStartSLOduration=2.187078541 podStartE2EDuration="2.678351693s" podCreationTimestamp="2025-09-30 18:20:48 +0000 UTC" firstStartedPulling="2025-09-30 18:20:49.664693306 +0000 UTC m=+2300.187192544" lastFinishedPulling="2025-09-30 18:20:50.155966458 +0000 UTC m=+2300.678465696" observedRunningTime="2025-09-30 18:20:50.674260862 +0000 UTC m=+2301.196760140" watchObservedRunningTime="2025-09-30 18:20:50.678351693 +0000 UTC m=+2301.200850941" Sep 30 18:21:03 crc kubenswrapper[4797]: I0930 18:21:03.238770 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:21:03 crc kubenswrapper[4797]: E0930 18:21:03.240092 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:21:16 crc kubenswrapper[4797]: I0930 18:21:16.238654 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:21:16 crc kubenswrapper[4797]: E0930 18:21:16.239594 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:21:29 crc kubenswrapper[4797]: I0930 18:21:29.258665 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:21:29 crc kubenswrapper[4797]: E0930 18:21:29.259966 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:21:41 crc kubenswrapper[4797]: I0930 18:21:41.239128 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:21:41 crc kubenswrapper[4797]: E0930 18:21:41.240185 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:21:53 crc kubenswrapper[4797]: I0930 18:21:53.238875 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:21:53 crc kubenswrapper[4797]: E0930 18:21:53.240037 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:22:05 crc kubenswrapper[4797]: I0930 18:22:05.238315 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:22:05 crc kubenswrapper[4797]: E0930 18:22:05.239818 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:22:17 crc kubenswrapper[4797]: I0930 18:22:17.238476 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:22:17 crc kubenswrapper[4797]: E0930 18:22:17.239518 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:22:29 crc kubenswrapper[4797]: I0930 18:22:29.238630 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:22:29 crc kubenswrapper[4797]: E0930 18:22:29.239751 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:22:43 crc kubenswrapper[4797]: I0930 18:22:43.238112 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:22:43 crc kubenswrapper[4797]: E0930 18:22:43.238910 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:22:57 crc kubenswrapper[4797]: I0930 18:22:57.238309 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:22:57 crc kubenswrapper[4797]: E0930 18:22:57.239682 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:23:08 crc kubenswrapper[4797]: I0930 18:23:08.238014 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:23:08 crc kubenswrapper[4797]: E0930 18:23:08.238770 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:23:23 crc kubenswrapper[4797]: I0930 18:23:23.240787 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:23:23 crc kubenswrapper[4797]: E0930 18:23:23.241756 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:23:34 crc kubenswrapper[4797]: I0930 18:23:34.237920 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:23:34 crc kubenswrapper[4797]: E0930 18:23:34.238849 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:23:48 crc kubenswrapper[4797]: I0930 18:23:48.238328 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:23:48 crc kubenswrapper[4797]: E0930 18:23:48.239200 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:24:01 crc kubenswrapper[4797]: I0930 18:24:01.238270 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:24:01 crc kubenswrapper[4797]: E0930 18:24:01.239160 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:24:15 crc kubenswrapper[4797]: I0930 18:24:15.238764 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:24:15 crc kubenswrapper[4797]: E0930 18:24:15.239996 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.044081 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"] Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.048143 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.070418 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"]
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.119547 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.119998 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.120220 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfzk\" (UniqueName: \"kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.222042 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.222144 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.222196 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfzk\" (UniqueName: \"kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.222965 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.223048 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.248688 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfzk\" (UniqueName: \"kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk\") pod \"certified-operators-mqlrb\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") " pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.406273 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:19 crc kubenswrapper[4797]: I0930 18:24:19.902857 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"]
Sep 30 18:24:20 crc kubenswrapper[4797]: I0930 18:24:20.098946 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerStarted","Data":"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"}
Sep 30 18:24:20 crc kubenswrapper[4797]: I0930 18:24:20.100331 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerStarted","Data":"7e894a4e4858d6a74c6c5362e84af1fc2e661c9e6c94c45705f6b3afe5421681"}
Sep 30 18:24:21 crc kubenswrapper[4797]: I0930 18:24:21.112012 4797 generic.go:334] "Generic (PLEG): container finished" podID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerID="aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9" exitCode=0
Sep 30 18:24:21 crc kubenswrapper[4797]: I0930 18:24:21.112075 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerDied","Data":"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"}
Sep 30 18:24:21 crc kubenswrapper[4797]: I0930 18:24:21.114367 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 18:24:22 crc kubenswrapper[4797]: I0930 18:24:22.124599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerStarted","Data":"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"}
Sep 30 18:24:23 crc kubenswrapper[4797]: I0930 18:24:23.145215 4797 generic.go:334] "Generic (PLEG): container finished" podID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerID="c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b" exitCode=0
Sep 30 18:24:23 crc kubenswrapper[4797]: I0930 18:24:23.145265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerDied","Data":"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"}
Sep 30 18:24:24 crc kubenswrapper[4797]: I0930 18:24:24.155844 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerStarted","Data":"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"}
Sep 30 18:24:24 crc kubenswrapper[4797]: I0930 18:24:24.182961 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mqlrb" podStartSLOduration=2.525193405 podStartE2EDuration="5.182940313s" podCreationTimestamp="2025-09-30 18:24:19 +0000 UTC" firstStartedPulling="2025-09-30 18:24:21.114085961 +0000 UTC m=+2511.636585209" lastFinishedPulling="2025-09-30 18:24:23.771832869 +0000 UTC m=+2514.294332117" observedRunningTime="2025-09-30 18:24:24.176562569 +0000 UTC m=+2514.699061817" watchObservedRunningTime="2025-09-30 18:24:24.182940313 +0000 UTC m=+2514.705439551"
Sep 30 18:24:26 crc kubenswrapper[4797]: I0930 18:24:26.239065 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"
Sep 30 18:24:26 crc kubenswrapper[4797]: E0930 18:24:26.239730 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:24:29 crc kubenswrapper[4797]: I0930 18:24:29.408164 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:29 crc kubenswrapper[4797]: I0930 18:24:29.408713 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:29 crc kubenswrapper[4797]: I0930 18:24:29.470351 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:30 crc kubenswrapper[4797]: I0930 18:24:30.334567 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:30 crc kubenswrapper[4797]: I0930 18:24:30.432061 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"]
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.284003 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mqlrb" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="registry-server" containerID="cri-o://9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f" gracePeriod=2
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.784632 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.832035 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwfzk\" (UniqueName: \"kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk\") pod \"964b2e3a-9c22-41a0-be49-c38394d3a002\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") "
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.832258 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities\") pod \"964b2e3a-9c22-41a0-be49-c38394d3a002\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") "
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.832842 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content\") pod \"964b2e3a-9c22-41a0-be49-c38394d3a002\" (UID: \"964b2e3a-9c22-41a0-be49-c38394d3a002\") "
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.833690 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities" (OuterVolumeSpecName: "utilities") pod "964b2e3a-9c22-41a0-be49-c38394d3a002" (UID: "964b2e3a-9c22-41a0-be49-c38394d3a002"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.844146 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk" (OuterVolumeSpecName: "kube-api-access-kwfzk") pod "964b2e3a-9c22-41a0-be49-c38394d3a002" (UID: "964b2e3a-9c22-41a0-be49-c38394d3a002"). InnerVolumeSpecName "kube-api-access-kwfzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.935249 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwfzk\" (UniqueName: \"kubernetes.io/projected/964b2e3a-9c22-41a0-be49-c38394d3a002-kube-api-access-kwfzk\") on node \"crc\" DevicePath \"\""
Sep 30 18:24:32 crc kubenswrapper[4797]: I0930 18:24:32.935277 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.058655 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "964b2e3a-9c22-41a0-be49-c38394d3a002" (UID: "964b2e3a-9c22-41a0-be49-c38394d3a002"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.140349 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/964b2e3a-9c22-41a0-be49-c38394d3a002-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.299264 4797 generic.go:334] "Generic (PLEG): container finished" podID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerID="9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f" exitCode=0
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.299351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerDied","Data":"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"}
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.299401 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mqlrb"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.300912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mqlrb" event={"ID":"964b2e3a-9c22-41a0-be49-c38394d3a002","Type":"ContainerDied","Data":"7e894a4e4858d6a74c6c5362e84af1fc2e661c9e6c94c45705f6b3afe5421681"}
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.301064 4797 scope.go:117] "RemoveContainer" containerID="9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.357825 4797 scope.go:117] "RemoveContainer" containerID="c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.375494 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"]
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.384999 4797 scope.go:117] "RemoveContainer" containerID="aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.388163 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mqlrb"]
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.442831 4797 scope.go:117] "RemoveContainer" containerID="9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"
Sep 30 18:24:33 crc kubenswrapper[4797]: E0930 18:24:33.443295 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f\": container with ID starting with 9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f not found: ID does not exist" containerID="9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.443326 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f"} err="failed to get container status \"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f\": rpc error: code = NotFound desc = could not find container \"9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f\": container with ID starting with 9b9de76085caa42ce58fe2d16de462f4d93c96807e8a61dbafa91b02cd3ce46f not found: ID does not exist"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.443348 4797 scope.go:117] "RemoveContainer" containerID="c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"
Sep 30 18:24:33 crc kubenswrapper[4797]: E0930 18:24:33.444233 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b\": container with ID starting with c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b not found: ID does not exist" containerID="c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.444255 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b"} err="failed to get container status \"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b\": rpc error: code = NotFound desc = could not find container \"c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b\": container with ID starting with c42799ec2641f3b7ba209df6159f31f0549871590082a7475c76fac73067008b not found: ID does not exist"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.444269 4797 scope.go:117] "RemoveContainer" containerID="aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"
Sep 30 18:24:33 crc kubenswrapper[4797]: E0930 18:24:33.444733 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9\": container with ID starting with aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9 not found: ID does not exist" containerID="aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"
Sep 30 18:24:33 crc kubenswrapper[4797]: I0930 18:24:33.444950 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9"} err="failed to get container status \"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9\": rpc error: code = NotFound desc = could not find container \"aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9\": container with ID starting with aeb0e1804db307abfa958db683c0a9482507dc048cab894659d1eba8846256a9 not found: ID does not exist"
Sep 30 18:24:34 crc kubenswrapper[4797]: I0930 18:24:34.259076 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" path="/var/lib/kubelet/pods/964b2e3a-9c22-41a0-be49-c38394d3a002/volumes"
Sep 30 18:24:38 crc kubenswrapper[4797]: I0930 18:24:38.238968 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"
Sep 30 18:24:38 crc kubenswrapper[4797]: E0930 18:24:38.239970 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 18:24:51 crc kubenswrapper[4797]: I0930 18:24:51.241859 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18"
Sep 30 18:24:51 crc kubenswrapper[4797]: I0930 18:24:51.515222 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b"}
Sep 30 18:25:35 crc kubenswrapper[4797]: I0930 18:25:35.039797 4797 generic.go:334] "Generic (PLEG): container finished" podID="a008ddae-ddb5-47a3-9423-0da1ffdb8322" containerID="c4ce100151ff478a273d335bd8f9de72802cf67437513f2eb78acf3150280c30" exitCode=0
Sep 30 18:25:35 crc kubenswrapper[4797]: I0930 18:25:35.039943 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" event={"ID":"a008ddae-ddb5-47a3-9423-0da1ffdb8322","Type":"ContainerDied","Data":"c4ce100151ff478a273d335bd8f9de72802cf67437513f2eb78acf3150280c30"}
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.500385 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94"
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.587793 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key\") pod \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") "
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.587939 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory\") pod \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") "
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.588116 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle\") pod \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") "
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.588140 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbxwl\" (UniqueName: \"kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl\") pod \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") "
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.588178 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0\") pod \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\" (UID: \"a008ddae-ddb5-47a3-9423-0da1ffdb8322\") "
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.594117 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl" (OuterVolumeSpecName: "kube-api-access-xbxwl") pod "a008ddae-ddb5-47a3-9423-0da1ffdb8322" (UID: "a008ddae-ddb5-47a3-9423-0da1ffdb8322"). InnerVolumeSpecName "kube-api-access-xbxwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.594550 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a008ddae-ddb5-47a3-9423-0da1ffdb8322" (UID: "a008ddae-ddb5-47a3-9423-0da1ffdb8322"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.626280 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a008ddae-ddb5-47a3-9423-0da1ffdb8322" (UID: "a008ddae-ddb5-47a3-9423-0da1ffdb8322"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.630373 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a008ddae-ddb5-47a3-9423-0da1ffdb8322" (UID: "a008ddae-ddb5-47a3-9423-0da1ffdb8322"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.642205 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory" (OuterVolumeSpecName: "inventory") pod "a008ddae-ddb5-47a3-9423-0da1ffdb8322" (UID: "a008ddae-ddb5-47a3-9423-0da1ffdb8322"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.690669 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.690747 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.690778 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbxwl\" (UniqueName: \"kubernetes.io/projected/a008ddae-ddb5-47a3-9423-0da1ffdb8322-kube-api-access-xbxwl\") on node \"crc\" DevicePath \"\""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.690804 4797 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Sep 30 18:25:36 crc kubenswrapper[4797]: I0930 18:25:36.690826 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a008ddae-ddb5-47a3-9423-0da1ffdb8322-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.065061 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94" event={"ID":"a008ddae-ddb5-47a3-9423-0da1ffdb8322","Type":"ContainerDied","Data":"8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f"}
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.065112 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8390da088ea8e19e57c1da93df437f9f18c382ea7d4c488c43a78d5e2b66f74f"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.065148 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vzb94"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.198506 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"]
Sep 30 18:25:37 crc kubenswrapper[4797]: E0930 18:25:37.199062 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="extract-content"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199089 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="extract-content"
Sep 30 18:25:37 crc kubenswrapper[4797]: E0930 18:25:37.199116 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="registry-server"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199127 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="registry-server"
Sep 30 18:25:37 crc kubenswrapper[4797]: E0930 18:25:37.199151 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="extract-utilities"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199165 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="extract-utilities"
Sep 30 18:25:37 crc kubenswrapper[4797]: E0930 18:25:37.199195 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a008ddae-ddb5-47a3-9423-0da1ffdb8322" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199207 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a008ddae-ddb5-47a3-9423-0da1ffdb8322" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199550 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a008ddae-ddb5-47a3-9423-0da1ffdb8322" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.199595 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="964b2e3a-9c22-41a0-be49-c38394d3a002" containerName="registry-server"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.200608 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.202976 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.203031 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.202981 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.203270 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.203281 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.203554 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.203638 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.211232 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"]
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302027 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302114 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302151 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302174 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302320 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302696 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.302800 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.303014 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s6gh\" (UniqueName: \"kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.303098 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.410685 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s6gh\" (UniqueName: \"kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.411133 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.411330 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.411522 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.411688 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.411845 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.412097 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.412462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.412679 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.415890 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.420078 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.425862 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.425903 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"
Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.438430 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.439525 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.445670 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s6gh\" (UniqueName: \"kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.446411 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.446890 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-92w66\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:37 crc kubenswrapper[4797]: I0930 18:25:37.536067 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:25:38 crc kubenswrapper[4797]: I0930 18:25:38.167156 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66"] Sep 30 18:25:39 crc kubenswrapper[4797]: I0930 18:25:39.087004 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" event={"ID":"cd2701f6-0eb3-4359-a16d-7435179896c0","Type":"ContainerStarted","Data":"6b7be5ed0b88265983a9bfc11bdb07937f4e9096b4937c5847a5302a94dcc818"} Sep 30 18:25:39 crc kubenswrapper[4797]: I0930 18:25:39.087532 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" event={"ID":"cd2701f6-0eb3-4359-a16d-7435179896c0","Type":"ContainerStarted","Data":"c5f2cd379459ae6ffa15b2ec2f438a16181afa5a151f5b45b10e29addaf99a2d"} Sep 30 18:25:39 crc kubenswrapper[4797]: I0930 18:25:39.117521 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" podStartSLOduration=1.6865480050000001 podStartE2EDuration="2.117496734s" podCreationTimestamp="2025-09-30 18:25:37 +0000 UTC" firstStartedPulling="2025-09-30 18:25:38.178846173 +0000 UTC m=+2588.701345411" lastFinishedPulling="2025-09-30 18:25:38.609794902 +0000 UTC m=+2589.132294140" observedRunningTime="2025-09-30 18:25:39.113861894 +0000 UTC m=+2589.636361132" watchObservedRunningTime="2025-09-30 18:25:39.117496734 +0000 UTC m=+2589.639995982" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.689670 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.694255 4797 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.705917 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.729782 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.729877 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cjr\" (UniqueName: \"kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.729901 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.831795 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cjr\" (UniqueName: \"kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.831862 4797 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.832098 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.833029 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.833980 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:38 crc kubenswrapper[4797]: I0930 18:26:38.863808 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cjr\" (UniqueName: \"kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr\") pod \"redhat-operators-6c5sm\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:39 crc kubenswrapper[4797]: I0930 18:26:39.068691 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:39 crc kubenswrapper[4797]: I0930 18:26:39.559562 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:26:39 crc kubenswrapper[4797]: I0930 18:26:39.793471 4797 generic.go:334] "Generic (PLEG): container finished" podID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerID="e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9" exitCode=0 Sep 30 18:26:39 crc kubenswrapper[4797]: I0930 18:26:39.793529 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerDied","Data":"e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9"} Sep 30 18:26:39 crc kubenswrapper[4797]: I0930 18:26:39.793880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerStarted","Data":"1967e9937a76791fb47dcbddb7c70db4d3f80fe946ea1ed40d95508416cb9e2d"} Sep 30 18:26:41 crc kubenswrapper[4797]: I0930 18:26:41.820090 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerStarted","Data":"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf"} Sep 30 18:26:44 crc kubenswrapper[4797]: I0930 18:26:44.857805 4797 generic.go:334] "Generic (PLEG): container finished" podID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerID="8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf" exitCode=0 Sep 30 18:26:44 crc kubenswrapper[4797]: I0930 18:26:44.857853 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" 
event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerDied","Data":"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf"} Sep 30 18:26:45 crc kubenswrapper[4797]: I0930 18:26:45.872301 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerStarted","Data":"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2"} Sep 30 18:26:45 crc kubenswrapper[4797]: I0930 18:26:45.912878 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6c5sm" podStartSLOduration=2.182880071 podStartE2EDuration="7.912857037s" podCreationTimestamp="2025-09-30 18:26:38 +0000 UTC" firstStartedPulling="2025-09-30 18:26:39.794832687 +0000 UTC m=+2650.317331915" lastFinishedPulling="2025-09-30 18:26:45.524809603 +0000 UTC m=+2656.047308881" observedRunningTime="2025-09-30 18:26:45.903216764 +0000 UTC m=+2656.425716052" watchObservedRunningTime="2025-09-30 18:26:45.912857037 +0000 UTC m=+2656.435356285" Sep 30 18:26:49 crc kubenswrapper[4797]: I0930 18:26:49.069718 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:49 crc kubenswrapper[4797]: I0930 18:26:49.070191 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:50 crc kubenswrapper[4797]: I0930 18:26:50.148091 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6c5sm" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="registry-server" probeResult="failure" output=< Sep 30 18:26:50 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 18:26:50 crc kubenswrapper[4797]: > Sep 30 18:26:59 crc kubenswrapper[4797]: I0930 18:26:59.138204 4797 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:59 crc kubenswrapper[4797]: I0930 18:26:59.222208 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:26:59 crc kubenswrapper[4797]: I0930 18:26:59.387489 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.050362 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6c5sm" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="registry-server" containerID="cri-o://ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2" gracePeriod=2 Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.479349 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.647575 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content\") pod \"d409e626-7571-48f6-b2ec-4b262a0b3736\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.647763 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities\") pod \"d409e626-7571-48f6-b2ec-4b262a0b3736\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.647848 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cjr\" (UniqueName: \"kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr\") pod 
\"d409e626-7571-48f6-b2ec-4b262a0b3736\" (UID: \"d409e626-7571-48f6-b2ec-4b262a0b3736\") " Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.648855 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities" (OuterVolumeSpecName: "utilities") pod "d409e626-7571-48f6-b2ec-4b262a0b3736" (UID: "d409e626-7571-48f6-b2ec-4b262a0b3736"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.655735 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr" (OuterVolumeSpecName: "kube-api-access-z7cjr") pod "d409e626-7571-48f6-b2ec-4b262a0b3736" (UID: "d409e626-7571-48f6-b2ec-4b262a0b3736"). InnerVolumeSpecName "kube-api-access-z7cjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.748486 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d409e626-7571-48f6-b2ec-4b262a0b3736" (UID: "d409e626-7571-48f6-b2ec-4b262a0b3736"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.749773 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cjr\" (UniqueName: \"kubernetes.io/projected/d409e626-7571-48f6-b2ec-4b262a0b3736-kube-api-access-z7cjr\") on node \"crc\" DevicePath \"\"" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.749797 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:27:01 crc kubenswrapper[4797]: I0930 18:27:01.749806 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d409e626-7571-48f6-b2ec-4b262a0b3736-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.061084 4797 generic.go:334] "Generic (PLEG): container finished" podID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerID="ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2" exitCode=0 Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.061139 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerDied","Data":"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2"} Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.061163 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6c5sm" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.061369 4797 scope.go:117] "RemoveContainer" containerID="ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.061355 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c5sm" event={"ID":"d409e626-7571-48f6-b2ec-4b262a0b3736","Type":"ContainerDied","Data":"1967e9937a76791fb47dcbddb7c70db4d3f80fe946ea1ed40d95508416cb9e2d"} Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.082950 4797 scope.go:117] "RemoveContainer" containerID="8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.090570 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.098988 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6c5sm"] Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.108597 4797 scope.go:117] "RemoveContainer" containerID="e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.154858 4797 scope.go:117] "RemoveContainer" containerID="ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2" Sep 30 18:27:02 crc kubenswrapper[4797]: E0930 18:27:02.155299 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2\": container with ID starting with ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2 not found: ID does not exist" containerID="ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.155350 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2"} err="failed to get container status \"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2\": rpc error: code = NotFound desc = could not find container \"ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2\": container with ID starting with ecfdd923bf4f406224e5fc7b259c8e97557217a054a5d0a202e29510dca31dd2 not found: ID does not exist" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.155382 4797 scope.go:117] "RemoveContainer" containerID="8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf" Sep 30 18:27:02 crc kubenswrapper[4797]: E0930 18:27:02.155868 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf\": container with ID starting with 8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf not found: ID does not exist" containerID="8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.155924 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf"} err="failed to get container status \"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf\": rpc error: code = NotFound desc = could not find container \"8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf\": container with ID starting with 8e3b571132f858d125e7b1c841e0f49c6d6d7a936d42d8b423f4cb5d2c49dedf not found: ID does not exist" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.155950 4797 scope.go:117] "RemoveContainer" containerID="e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9" Sep 30 18:27:02 crc kubenswrapper[4797]: E0930 
18:27:02.156208 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9\": container with ID starting with e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9 not found: ID does not exist" containerID="e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.156241 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9"} err="failed to get container status \"e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9\": rpc error: code = NotFound desc = could not find container \"e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9\": container with ID starting with e7af57de78ce2e2c6795c61e3072fd64281adee3fc4b7503817f59a7eaa46ee9 not found: ID does not exist" Sep 30 18:27:02 crc kubenswrapper[4797]: I0930 18:27:02.248812 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" path="/var/lib/kubelet/pods/d409e626-7571-48f6-b2ec-4b262a0b3736/volumes" Sep 30 18:27:14 crc kubenswrapper[4797]: I0930 18:27:14.192908 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:27:14 crc kubenswrapper[4797]: I0930 18:27:14.194276 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 18:27:44 crc kubenswrapper[4797]: I0930 18:27:44.191910 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:27:44 crc kubenswrapper[4797]: I0930 18:27:44.192631 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.191946 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.192506 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.192554 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.193385 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b"} 
pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.193474 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b" gracePeriod=600 Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.893097 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b" exitCode=0 Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.893218 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b"} Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.893576 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478"} Sep 30 18:28:14 crc kubenswrapper[4797]: I0930 18:28:14.893597 4797 scope.go:117] "RemoveContainer" containerID="817be7b713b97fb7469028ed51a4c0622412de203f6b22df71c2dbea0a52aa18" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.576555 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:35 crc kubenswrapper[4797]: E0930 18:28:35.577664 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="registry-server" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.577684 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="registry-server" Sep 30 18:28:35 crc kubenswrapper[4797]: E0930 18:28:35.577727 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="extract-utilities" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.577740 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="extract-utilities" Sep 30 18:28:35 crc kubenswrapper[4797]: E0930 18:28:35.577782 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="extract-content" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.577796 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="extract-content" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.578174 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="d409e626-7571-48f6-b2ec-4b262a0b3736" containerName="registry-server" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.584821 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.594993 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.717236 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.717285 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.717355 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whk6k\" (UniqueName: \"kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.818705 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.818755 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.818827 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whk6k\" (UniqueName: \"kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.819324 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.819376 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.839935 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whk6k\" (UniqueName: \"kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k\") pod \"community-operators-gktqb\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:35 crc kubenswrapper[4797]: I0930 18:28:35.919856 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:36 crc kubenswrapper[4797]: I0930 18:28:36.440203 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:37 crc kubenswrapper[4797]: I0930 18:28:37.152101 4797 generic.go:334] "Generic (PLEG): container finished" podID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerID="e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98" exitCode=0 Sep 30 18:28:37 crc kubenswrapper[4797]: I0930 18:28:37.152226 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerDied","Data":"e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98"} Sep 30 18:28:37 crc kubenswrapper[4797]: I0930 18:28:37.152429 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerStarted","Data":"b542bd02001efd97a3b6261bed302448db95e31bddd3ea88d22ac10cfd70d1a6"} Sep 30 18:28:39 crc kubenswrapper[4797]: I0930 18:28:39.185012 4797 generic.go:334] "Generic (PLEG): container finished" podID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerID="0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6" exitCode=0 Sep 30 18:28:39 crc kubenswrapper[4797]: I0930 18:28:39.185244 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerDied","Data":"0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6"} Sep 30 18:28:40 crc kubenswrapper[4797]: I0930 18:28:40.201019 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" 
event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerStarted","Data":"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508"} Sep 30 18:28:40 crc kubenswrapper[4797]: I0930 18:28:40.237939 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gktqb" podStartSLOduration=2.648278769 podStartE2EDuration="5.237916738s" podCreationTimestamp="2025-09-30 18:28:35 +0000 UTC" firstStartedPulling="2025-09-30 18:28:37.154364483 +0000 UTC m=+2767.676863741" lastFinishedPulling="2025-09-30 18:28:39.744002472 +0000 UTC m=+2770.266501710" observedRunningTime="2025-09-30 18:28:40.227526844 +0000 UTC m=+2770.750026102" watchObservedRunningTime="2025-09-30 18:28:40.237916738 +0000 UTC m=+2770.760415986" Sep 30 18:28:45 crc kubenswrapper[4797]: I0930 18:28:45.920594 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:45 crc kubenswrapper[4797]: I0930 18:28:45.921289 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:45 crc kubenswrapper[4797]: I0930 18:28:45.993077 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:46 crc kubenswrapper[4797]: I0930 18:28:46.329137 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:46 crc kubenswrapper[4797]: I0930 18:28:46.378296 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.271362 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gktqb" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="registry-server" 
containerID="cri-o://d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508" gracePeriod=2 Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.752272 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.908162 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content\") pod \"47aefebb-98ca-43d6-a248-4b3552ce6df3\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.908257 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities\") pod \"47aefebb-98ca-43d6-a248-4b3552ce6df3\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.908320 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whk6k\" (UniqueName: \"kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k\") pod \"47aefebb-98ca-43d6-a248-4b3552ce6df3\" (UID: \"47aefebb-98ca-43d6-a248-4b3552ce6df3\") " Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.909174 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities" (OuterVolumeSpecName: "utilities") pod "47aefebb-98ca-43d6-a248-4b3552ce6df3" (UID: "47aefebb-98ca-43d6-a248-4b3552ce6df3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.914012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k" (OuterVolumeSpecName: "kube-api-access-whk6k") pod "47aefebb-98ca-43d6-a248-4b3552ce6df3" (UID: "47aefebb-98ca-43d6-a248-4b3552ce6df3"). InnerVolumeSpecName "kube-api-access-whk6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:28:48 crc kubenswrapper[4797]: I0930 18:28:48.961566 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47aefebb-98ca-43d6-a248-4b3552ce6df3" (UID: "47aefebb-98ca-43d6-a248-4b3552ce6df3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.011087 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.011118 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aefebb-98ca-43d6-a248-4b3552ce6df3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.011130 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whk6k\" (UniqueName: \"kubernetes.io/projected/47aefebb-98ca-43d6-a248-4b3552ce6df3-kube-api-access-whk6k\") on node \"crc\" DevicePath \"\"" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.283111 4797 generic.go:334] "Generic (PLEG): container finished" podID="47aefebb-98ca-43d6-a248-4b3552ce6df3" 
containerID="d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508" exitCode=0 Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.283167 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerDied","Data":"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508"} Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.283541 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gktqb" event={"ID":"47aefebb-98ca-43d6-a248-4b3552ce6df3","Type":"ContainerDied","Data":"b542bd02001efd97a3b6261bed302448db95e31bddd3ea88d22ac10cfd70d1a6"} Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.283579 4797 scope.go:117] "RemoveContainer" containerID="d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.283197 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gktqb" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.326872 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.334729 4797 scope.go:117] "RemoveContainer" containerID="0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.335112 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gktqb"] Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.367185 4797 scope.go:117] "RemoveContainer" containerID="e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.436668 4797 scope.go:117] "RemoveContainer" containerID="d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508" Sep 30 18:28:49 crc kubenswrapper[4797]: E0930 18:28:49.437135 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508\": container with ID starting with d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508 not found: ID does not exist" containerID="d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.437172 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508"} err="failed to get container status \"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508\": rpc error: code = NotFound desc = could not find container \"d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508\": container with ID starting with d7f226db1cf4dc0e38d6bd84fa23750c93460d2fc0c908fbc4d71e12a8a72508 not 
found: ID does not exist" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.437199 4797 scope.go:117] "RemoveContainer" containerID="0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6" Sep 30 18:28:49 crc kubenswrapper[4797]: E0930 18:28:49.437390 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6\": container with ID starting with 0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6 not found: ID does not exist" containerID="0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.437417 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6"} err="failed to get container status \"0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6\": rpc error: code = NotFound desc = could not find container \"0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6\": container with ID starting with 0f4a34fb949ac5d4076815582fd7929ef62566ec6ec983f68a36f3d13bad42e6 not found: ID does not exist" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.437454 4797 scope.go:117] "RemoveContainer" containerID="e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98" Sep 30 18:28:49 crc kubenswrapper[4797]: E0930 18:28:49.437647 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98\": container with ID starting with e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98 not found: ID does not exist" containerID="e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98" Sep 30 18:28:49 crc kubenswrapper[4797]: I0930 18:28:49.437680 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98"} err="failed to get container status \"e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98\": rpc error: code = NotFound desc = could not find container \"e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98\": container with ID starting with e3fc2714d761a29177be32c6e6a1b3ffa9ed09ceab14efbc4c620d1030891e98 not found: ID does not exist" Sep 30 18:28:50 crc kubenswrapper[4797]: I0930 18:28:50.250480 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" path="/var/lib/kubelet/pods/47aefebb-98ca-43d6-a248-4b3552ce6df3/volumes" Sep 30 18:29:34 crc kubenswrapper[4797]: I0930 18:29:34.800556 4797 generic.go:334] "Generic (PLEG): container finished" podID="cd2701f6-0eb3-4359-a16d-7435179896c0" containerID="6b7be5ed0b88265983a9bfc11bdb07937f4e9096b4937c5847a5302a94dcc818" exitCode=0 Sep 30 18:29:34 crc kubenswrapper[4797]: I0930 18:29:34.800625 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" event={"ID":"cd2701f6-0eb3-4359-a16d-7435179896c0","Type":"ContainerDied","Data":"6b7be5ed0b88265983a9bfc11bdb07937f4e9096b4937c5847a5302a94dcc818"} Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.396218 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.486671 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.486717 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.516857 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory" (OuterVolumeSpecName: "inventory") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.534997 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588057 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588144 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588294 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588339 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s6gh\" (UniqueName: \"kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588364 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588395 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.588663 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1\") pod \"cd2701f6-0eb3-4359-a16d-7435179896c0\" (UID: \"cd2701f6-0eb3-4359-a16d-7435179896c0\") " Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.589500 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.589520 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.592921 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.593487 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh" (OuterVolumeSpecName: "kube-api-access-2s6gh") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). 
InnerVolumeSpecName "kube-api-access-2s6gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.614525 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.621235 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.621710 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.627677 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.639502 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "cd2701f6-0eb3-4359-a16d-7435179896c0" (UID: "cd2701f6-0eb3-4359-a16d-7435179896c0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691525 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691563 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s6gh\" (UniqueName: \"kubernetes.io/projected/cd2701f6-0eb3-4359-a16d-7435179896c0-kube-api-access-2s6gh\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691573 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691581 4797 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691589 4797 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc 
kubenswrapper[4797]: I0930 18:29:36.691598 4797 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.691606 4797 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2701f6-0eb3-4359-a16d-7435179896c0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.828458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" event={"ID":"cd2701f6-0eb3-4359-a16d-7435179896c0","Type":"ContainerDied","Data":"c5f2cd379459ae6ffa15b2ec2f438a16181afa5a151f5b45b10e29addaf99a2d"} Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.828503 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f2cd379459ae6ffa15b2ec2f438a16181afa5a151f5b45b10e29addaf99a2d" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.828575 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-92w66" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.959745 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm"] Sep 30 18:29:36 crc kubenswrapper[4797]: E0930 18:29:36.960545 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="extract-utilities" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.960576 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="extract-utilities" Sep 30 18:29:36 crc kubenswrapper[4797]: E0930 18:29:36.960599 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="extract-content" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.960612 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="extract-content" Sep 30 18:29:36 crc kubenswrapper[4797]: E0930 18:29:36.960645 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="registry-server" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.960660 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="registry-server" Sep 30 18:29:36 crc kubenswrapper[4797]: E0930 18:29:36.960695 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2701f6-0eb3-4359-a16d-7435179896c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.960708 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd2701f6-0eb3-4359-a16d-7435179896c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.961074 4797 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cd2701f6-0eb3-4359-a16d-7435179896c0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.961104 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aefebb-98ca-43d6-a248-4b3552ce6df3" containerName="registry-server" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.962192 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.965292 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.965350 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.965881 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.966179 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48g9t" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.966571 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.985927 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm"] Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.998670 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.998858 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.998910 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.999068 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.999109 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.999258 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:36 crc kubenswrapper[4797]: I0930 18:29:36.999534 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6gpj\" (UniqueName: \"kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102109 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102168 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102188 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102218 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.102303 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6gpj\" (UniqueName: \"kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.103592 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.107368 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.107940 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.108234 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.108776 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.109474 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.109867 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.128143 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6gpj\" (UniqueName: \"kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.299354 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.742415 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm"] Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.746926 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:29:37 crc kubenswrapper[4797]: I0930 18:29:37.842626 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" event={"ID":"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3","Type":"ContainerStarted","Data":"feff5502d98f404dc64458a8afa08764ffd547fa9ef2c2f10dc7438765db546c"} Sep 30 18:29:38 crc kubenswrapper[4797]: I0930 18:29:38.855210 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" event={"ID":"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3","Type":"ContainerStarted","Data":"6d9817536b05ccb769f8d2a205eb3539e9c7684bd246645852cbbab09067514c"} Sep 30 18:29:38 crc kubenswrapper[4797]: I0930 18:29:38.886582 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" podStartSLOduration=2.365915364 podStartE2EDuration="2.886561692s" podCreationTimestamp="2025-09-30 18:29:36 +0000 UTC" firstStartedPulling="2025-09-30 18:29:37.746745135 +0000 UTC m=+2828.269244363" lastFinishedPulling="2025-09-30 18:29:38.267391443 +0000 UTC m=+2828.789890691" observedRunningTime="2025-09-30 18:29:38.880724542 +0000 UTC m=+2829.403223820" watchObservedRunningTime="2025-09-30 18:29:38.886561692 +0000 UTC m=+2829.409060960" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.146406 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8"] Sep 30 18:30:00 crc 
kubenswrapper[4797]: I0930 18:30:00.148547 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.151183 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.163639 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8"] Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.168868 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.215080 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt875\" (UniqueName: \"kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.215391 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.215795 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume\") pod \"collect-profiles-29320950-g7wk8\" 
(UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.317971 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.318121 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.318210 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt875\" (UniqueName: \"kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.318953 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.323627 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.353060 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt875\" (UniqueName: \"kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875\") pod \"collect-profiles-29320950-g7wk8\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.476127 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:00 crc kubenswrapper[4797]: I0930 18:30:00.933103 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8"] Sep 30 18:30:00 crc kubenswrapper[4797]: W0930 18:30:00.939837 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c273c7c_5815_4b45_b341_67a5cb16a202.slice/crio-63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db WatchSource:0}: Error finding container 63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db: Status 404 returned error can't find the container with id 63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db Sep 30 18:30:01 crc kubenswrapper[4797]: I0930 18:30:01.103977 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" event={"ID":"1c273c7c-5815-4b45-b341-67a5cb16a202","Type":"ContainerStarted","Data":"63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db"} Sep 30 18:30:02 crc 
kubenswrapper[4797]: I0930 18:30:02.115265 4797 generic.go:334] "Generic (PLEG): container finished" podID="1c273c7c-5815-4b45-b341-67a5cb16a202" containerID="4bc69441cdfad1fb9274e4975522fee6a1e17eacecc77a26209380554c43b814" exitCode=0 Sep 30 18:30:02 crc kubenswrapper[4797]: I0930 18:30:02.115326 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" event={"ID":"1c273c7c-5815-4b45-b341-67a5cb16a202","Type":"ContainerDied","Data":"4bc69441cdfad1fb9274e4975522fee6a1e17eacecc77a26209380554c43b814"} Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.469490 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.480400 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt875\" (UniqueName: \"kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875\") pod \"1c273c7c-5815-4b45-b341-67a5cb16a202\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.480497 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume\") pod \"1c273c7c-5815-4b45-b341-67a5cb16a202\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.480696 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume\") pod \"1c273c7c-5815-4b45-b341-67a5cb16a202\" (UID: \"1c273c7c-5815-4b45-b341-67a5cb16a202\") " Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.481389 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c273c7c-5815-4b45-b341-67a5cb16a202" (UID: "1c273c7c-5815-4b45-b341-67a5cb16a202"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.486677 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875" (OuterVolumeSpecName: "kube-api-access-wt875") pod "1c273c7c-5815-4b45-b341-67a5cb16a202" (UID: "1c273c7c-5815-4b45-b341-67a5cb16a202"). InnerVolumeSpecName "kube-api-access-wt875". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.497045 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c273c7c-5815-4b45-b341-67a5cb16a202" (UID: "1c273c7c-5815-4b45-b341-67a5cb16a202"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.583697 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt875\" (UniqueName: \"kubernetes.io/projected/1c273c7c-5815-4b45-b341-67a5cb16a202-kube-api-access-wt875\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.583725 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c273c7c-5815-4b45-b341-67a5cb16a202-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:03 crc kubenswrapper[4797]: I0930 18:30:03.583735 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c273c7c-5815-4b45-b341-67a5cb16a202-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:30:04 crc kubenswrapper[4797]: I0930 18:30:04.160792 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" Sep 30 18:30:04 crc kubenswrapper[4797]: I0930 18:30:04.160932 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8" event={"ID":"1c273c7c-5815-4b45-b341-67a5cb16a202","Type":"ContainerDied","Data":"63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db"} Sep 30 18:30:04 crc kubenswrapper[4797]: I0930 18:30:04.161039 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63b7e04e299e6eb32aeebd3942cf598aa032691dc4c7bd6d26044cab47c7e5db" Sep 30 18:30:04 crc kubenswrapper[4797]: I0930 18:30:04.566336 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr"] Sep 30 18:30:04 crc kubenswrapper[4797]: I0930 18:30:04.585297 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-9m9zr"] Sep 30 18:30:06 crc kubenswrapper[4797]: I0930 18:30:06.258033 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542a58fe-9654-424c-90dc-6d073f486328" path="/var/lib/kubelet/pods/542a58fe-9654-424c-90dc-6d073f486328/volumes" Sep 30 18:30:14 crc kubenswrapper[4797]: I0930 18:30:14.192935 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:30:14 crc kubenswrapper[4797]: I0930 18:30:14.193603 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:30:16 crc kubenswrapper[4797]: I0930 18:30:16.784371 4797 scope.go:117] "RemoveContainer" containerID="68560197d586281eddf9b016d42438c6820920ce9461182ad4bb1db69e97865f" Sep 30 18:30:44 crc kubenswrapper[4797]: I0930 18:30:44.191806 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:30:44 crc kubenswrapper[4797]: I0930 18:30:44.192334 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.191867 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.192733 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.192808 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.193885 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.193990 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" gracePeriod=600 Sep 30 18:31:14 crc kubenswrapper[4797]: E0930 18:31:14.327949 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.960376 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" exitCode=0 Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.960427 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478"} Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.960841 4797 scope.go:117] "RemoveContainer" containerID="dc811f59fc9e121518578c6321b6597febeb9b12b073a392c42c4263e30bc31b" Sep 30 18:31:14 crc kubenswrapper[4797]: I0930 18:31:14.962482 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:31:14 crc kubenswrapper[4797]: E0930 18:31:14.963131 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:31:27 crc kubenswrapper[4797]: I0930 18:31:27.238953 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:31:27 crc kubenswrapper[4797]: E0930 18:31:27.240250 4797 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:31:39 crc kubenswrapper[4797]: I0930 18:31:39.238226 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:31:39 crc kubenswrapper[4797]: E0930 18:31:39.239622 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.852718 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:31:46 crc kubenswrapper[4797]: E0930 18:31:46.854774 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c273c7c-5815-4b45-b341-67a5cb16a202" containerName="collect-profiles" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.854791 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c273c7c-5815-4b45-b341-67a5cb16a202" containerName="collect-profiles" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.855044 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c273c7c-5815-4b45-b341-67a5cb16a202" containerName="collect-profiles" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.856813 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.897000 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.948906 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.950004 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:46 crc kubenswrapper[4797]: I0930 18:31:46.950124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frnf\" (UniqueName: \"kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.052229 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.052484 4797 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5frnf\" (UniqueName: \"kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.052620 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.052899 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.053135 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.075068 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frnf\" (UniqueName: \"kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf\") pod \"redhat-marketplace-46l8q\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.185837 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:47 crc kubenswrapper[4797]: I0930 18:31:47.631184 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:31:48 crc kubenswrapper[4797]: I0930 18:31:48.337919 4797 generic.go:334] "Generic (PLEG): container finished" podID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerID="f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394" exitCode=0 Sep 30 18:31:48 crc kubenswrapper[4797]: I0930 18:31:48.338085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerDied","Data":"f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394"} Sep 30 18:31:48 crc kubenswrapper[4797]: I0930 18:31:48.338371 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerStarted","Data":"5ba2c3e54a5a5d777ca85c4848bc7725b0c9978625599c3816904b3e3b45cf7e"} Sep 30 18:31:50 crc kubenswrapper[4797]: I0930 18:31:50.362811 4797 generic.go:334] "Generic (PLEG): container finished" podID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerID="520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a" exitCode=0 Sep 30 18:31:50 crc kubenswrapper[4797]: I0930 18:31:50.362934 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerDied","Data":"520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a"} Sep 30 18:31:51 crc kubenswrapper[4797]: I0930 18:31:51.237847 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:31:51 crc kubenswrapper[4797]: E0930 18:31:51.238884 4797 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:31:51 crc kubenswrapper[4797]: I0930 18:31:51.383485 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerStarted","Data":"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64"} Sep 30 18:31:51 crc kubenswrapper[4797]: I0930 18:31:51.421450 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-46l8q" podStartSLOduration=2.822619227 podStartE2EDuration="5.421411448s" podCreationTimestamp="2025-09-30 18:31:46 +0000 UTC" firstStartedPulling="2025-09-30 18:31:48.341858902 +0000 UTC m=+2958.864358170" lastFinishedPulling="2025-09-30 18:31:50.940651113 +0000 UTC m=+2961.463150391" observedRunningTime="2025-09-30 18:31:51.408251618 +0000 UTC m=+2961.930750876" watchObservedRunningTime="2025-09-30 18:31:51.421411448 +0000 UTC m=+2961.943910696" Sep 30 18:31:57 crc kubenswrapper[4797]: I0930 18:31:57.186380 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:57 crc kubenswrapper[4797]: I0930 18:31:57.187009 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:57 crc kubenswrapper[4797]: I0930 18:31:57.244786 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:57 crc kubenswrapper[4797]: I0930 18:31:57.532508 4797 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:31:57 crc kubenswrapper[4797]: I0930 18:31:57.588082 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:31:59 crc kubenswrapper[4797]: I0930 18:31:59.502021 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-46l8q" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="registry-server" containerID="cri-o://b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64" gracePeriod=2 Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.048870 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.143006 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frnf\" (UniqueName: \"kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf\") pod \"72a96d00-6548-4cf4-8afb-6707bfc75b89\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.143406 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities\") pod \"72a96d00-6548-4cf4-8afb-6707bfc75b89\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.143711 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content\") pod \"72a96d00-6548-4cf4-8afb-6707bfc75b89\" (UID: \"72a96d00-6548-4cf4-8afb-6707bfc75b89\") " Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.144916 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities" (OuterVolumeSpecName: "utilities") pod "72a96d00-6548-4cf4-8afb-6707bfc75b89" (UID: "72a96d00-6548-4cf4-8afb-6707bfc75b89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.149664 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf" (OuterVolumeSpecName: "kube-api-access-5frnf") pod "72a96d00-6548-4cf4-8afb-6707bfc75b89" (UID: "72a96d00-6548-4cf4-8afb-6707bfc75b89"). InnerVolumeSpecName "kube-api-access-5frnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.162657 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72a96d00-6548-4cf4-8afb-6707bfc75b89" (UID: "72a96d00-6548-4cf4-8afb-6707bfc75b89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.247055 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5frnf\" (UniqueName: \"kubernetes.io/projected/72a96d00-6548-4cf4-8afb-6707bfc75b89-kube-api-access-5frnf\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.247090 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.247101 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a96d00-6548-4cf4-8afb-6707bfc75b89-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.512690 4797 generic.go:334] "Generic (PLEG): container finished" podID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerID="b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64" exitCode=0 Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.512759 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-46l8q" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.514010 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerDied","Data":"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64"} Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.514147 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-46l8q" event={"ID":"72a96d00-6548-4cf4-8afb-6707bfc75b89","Type":"ContainerDied","Data":"5ba2c3e54a5a5d777ca85c4848bc7725b0c9978625599c3816904b3e3b45cf7e"} Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.514229 4797 scope.go:117] "RemoveContainer" containerID="b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.536127 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.537777 4797 scope.go:117] "RemoveContainer" containerID="520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.551528 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-46l8q"] Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.559677 4797 scope.go:117] "RemoveContainer" containerID="f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.601906 4797 scope.go:117] "RemoveContainer" containerID="b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64" Sep 30 18:32:00 crc kubenswrapper[4797]: E0930 18:32:00.602384 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64\": container with ID starting with b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64 not found: ID does not exist" containerID="b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.602565 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64"} err="failed to get container status \"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64\": rpc error: code = NotFound desc = could not find container \"b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64\": container with ID starting with b4e9345e76ae091c3b1928c10bf9bb2d8e4863e7da20397811a94343fc779b64 not found: ID does not exist" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.602708 4797 scope.go:117] "RemoveContainer" containerID="520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a" Sep 30 18:32:00 crc kubenswrapper[4797]: E0930 18:32:00.603258 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a\": container with ID starting with 520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a not found: ID does not exist" containerID="520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.603382 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a"} err="failed to get container status \"520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a\": rpc error: code = NotFound desc = could not find container \"520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a\": container with ID 
starting with 520c3427a95963327324787994b5af325507a0a7ff6aabfeadb7d4d9d4d6f11a not found: ID does not exist" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.603517 4797 scope.go:117] "RemoveContainer" containerID="f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394" Sep 30 18:32:00 crc kubenswrapper[4797]: E0930 18:32:00.603829 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394\": container with ID starting with f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394 not found: ID does not exist" containerID="f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394" Sep 30 18:32:00 crc kubenswrapper[4797]: I0930 18:32:00.603941 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394"} err="failed to get container status \"f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394\": rpc error: code = NotFound desc = could not find container \"f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394\": container with ID starting with f264520afb06fb9c4880b6cd5c810a6ed383e076729a1e14acb7e1c601a5f394 not found: ID does not exist" Sep 30 18:32:02 crc kubenswrapper[4797]: I0930 18:32:02.260571 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" path="/var/lib/kubelet/pods/72a96d00-6548-4cf4-8afb-6707bfc75b89/volumes" Sep 30 18:32:04 crc kubenswrapper[4797]: I0930 18:32:04.239967 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:32:04 crc kubenswrapper[4797]: E0930 18:32:04.240419 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:32:16 crc kubenswrapper[4797]: I0930 18:32:16.700545 4797 generic.go:334] "Generic (PLEG): container finished" podID="287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" containerID="6d9817536b05ccb769f8d2a205eb3539e9c7684bd246645852cbbab09067514c" exitCode=0 Sep 30 18:32:16 crc kubenswrapper[4797]: I0930 18:32:16.700651 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" event={"ID":"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3","Type":"ContainerDied","Data":"6d9817536b05ccb769f8d2a205eb3539e9c7684bd246645852cbbab09067514c"} Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.231603 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.239637 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:32:18 crc kubenswrapper[4797]: E0930 18:32:18.240002 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.428770 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.428928 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.429026 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.429146 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6gpj\" (UniqueName: \"kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.429215 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.429298 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2\") pod 
\"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.429360 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key\") pod \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\" (UID: \"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3\") " Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.434523 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj" (OuterVolumeSpecName: "kube-api-access-v6gpj") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "kube-api-access-v6gpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.441706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.457644 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.458314 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.467552 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.476638 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.488351 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory" (OuterVolumeSpecName: "inventory") pod "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" (UID: "287e8ba5-a33d-49ac-bd3f-b85dd5a401a3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.532881 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.532950 4797 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.534002 4797 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.534019 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6gpj\" (UniqueName: \"kubernetes.io/projected/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-kube-api-access-v6gpj\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.534031 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.534045 4797 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.534059 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/287e8ba5-a33d-49ac-bd3f-b85dd5a401a3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.726639 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.726544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm" event={"ID":"287e8ba5-a33d-49ac-bd3f-b85dd5a401a3","Type":"ContainerDied","Data":"feff5502d98f404dc64458a8afa08764ffd547fa9ef2c2f10dc7438765db546c"} Sep 30 18:32:18 crc kubenswrapper[4797]: I0930 18:32:18.726709 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feff5502d98f404dc64458a8afa08764ffd547fa9ef2c2f10dc7438765db546c" Sep 30 18:32:33 crc kubenswrapper[4797]: I0930 18:32:33.238389 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:32:33 crc kubenswrapper[4797]: E0930 18:32:33.240517 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:32:45 crc kubenswrapper[4797]: I0930 18:32:45.239352 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:32:45 crc kubenswrapper[4797]: E0930 18:32:45.240592 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:32:54 crc kubenswrapper[4797]: I0930 18:32:54.254290 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:54 crc kubenswrapper[4797]: I0930 18:32:54.255342 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="prometheus" containerID="cri-o://98b0a5b108c6563502107929a2d92851be5ca924b5ec9a9eb068abf5b67c0a0f" gracePeriod=600 Sep 30 18:32:54 crc kubenswrapper[4797]: I0930 18:32:54.255477 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="config-reloader" containerID="cri-o://7fdaa5a582c2a250506cca3e78e660c432758d28c95f692db0c6bbbcc9027b68" gracePeriod=600 Sep 30 18:32:54 crc kubenswrapper[4797]: I0930 18:32:54.255745 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="thanos-sidecar" containerID="cri-o://0e4af960f0a26fef96c5fd646c9bd99282c8a90827127f63d26c69c2bcc27795" gracePeriod=600 Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201159 4797 generic.go:334] "Generic (PLEG): container finished" podID="572d2f77-3315-4b90-860e-18d1973993ef" containerID="0e4af960f0a26fef96c5fd646c9bd99282c8a90827127f63d26c69c2bcc27795" exitCode=0 Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201739 4797 generic.go:334] "Generic (PLEG): container finished" podID="572d2f77-3315-4b90-860e-18d1973993ef" containerID="7fdaa5a582c2a250506cca3e78e660c432758d28c95f692db0c6bbbcc9027b68" exitCode=0 Sep 30 18:32:55 crc 
kubenswrapper[4797]: I0930 18:32:55.201246 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerDied","Data":"0e4af960f0a26fef96c5fd646c9bd99282c8a90827127f63d26c69c2bcc27795"} Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerDied","Data":"7fdaa5a582c2a250506cca3e78e660c432758d28c95f692db0c6bbbcc9027b68"} Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201802 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerDied","Data":"98b0a5b108c6563502107929a2d92851be5ca924b5ec9a9eb068abf5b67c0a0f"} Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201752 4797 generic.go:334] "Generic (PLEG): container finished" podID="572d2f77-3315-4b90-860e-18d1973993ef" containerID="98b0a5b108c6563502107929a2d92851be5ca924b5ec9a9eb068abf5b67c0a0f" exitCode=0 Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201820 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"572d2f77-3315-4b90-860e-18d1973993ef","Type":"ContainerDied","Data":"10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e"} Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.201831 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f7865b7fb770d1c7a973fc6284b253a3eb9edbd9d7f03aced4aee64c06a30e" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.260293 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.438893 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.438981 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439015 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439101 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439135 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439160 4797 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgls\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439415 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439608 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439654 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.439678 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"572d2f77-3315-4b90-860e-18d1973993ef\" (UID: \"572d2f77-3315-4b90-860e-18d1973993ef\") " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.440119 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.474526 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.474616 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.474706 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls" (OuterVolumeSpecName: "kube-api-access-5hgls") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "kube-api-access-5hgls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.474749 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config" (OuterVolumeSpecName: "config") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.474791 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out" (OuterVolumeSpecName: "config-out") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.475852 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.489887 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.493599 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547913 4797 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/572d2f77-3315-4b90-860e-18d1973993ef-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547941 4797 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/572d2f77-3315-4b90-860e-18d1973993ef-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547954 4797 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547963 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgls\" (UniqueName: 
\"kubernetes.io/projected/572d2f77-3315-4b90-860e-18d1973993ef-kube-api-access-5hgls\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547972 4797 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.547997 4797 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.548010 4797 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.548018 4797 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.548028 4797 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.672101 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config" (OuterVolumeSpecName: "web-config") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: 
"572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.686802 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "572d2f77-3315-4b90-860e-18d1973993ef" (UID: "572d2f77-3315-4b90-860e-18d1973993ef"). InnerVolumeSpecName "pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.751955 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") on node \"crc\" " Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.752144 4797 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/572d2f77-3315-4b90-860e-18d1973993ef-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.774427 4797 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.774678 4797 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf") on node "crc" Sep 30 18:32:55 crc kubenswrapper[4797]: I0930 18:32:55.853885 4797 reconciler_common.go:293] "Volume detached for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") on node \"crc\" DevicePath \"\"" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.210991 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.258984 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.272507 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284089 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284481 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284496 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284505 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="thanos-sidecar" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 
18:32:56.284511 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="thanos-sidecar" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284523 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="config-reloader" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284528 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="config-reloader" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284538 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="prometheus" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284543 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="prometheus" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284557 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="init-config-reloader" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284563 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="init-config-reloader" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284587 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="extract-utilities" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284592 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="extract-utilities" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284599 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="extract-content" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 
18:32:56.284606 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="extract-content" Sep 30 18:32:56 crc kubenswrapper[4797]: E0930 18:32:56.284615 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="registry-server" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284620 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="registry-server" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284774 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="287e8ba5-a33d-49ac-bd3f-b85dd5a401a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284793 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="prometheus" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284800 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="thanos-sidecar" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284809 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="572d2f77-3315-4b90-860e-18d1973993ef" containerName="config-reloader" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.284826 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a96d00-6548-4cf4-8afb-6707bfc75b89" containerName="registry-server" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.294142 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.296497 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.296696 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.296816 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-n6tgk" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.298928 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.300072 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.304473 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.328518 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463525 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463570 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463600 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e430836-74b1-48cb-84bc-f623d27d6c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463652 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463683 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463716 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 
18:32:56.463746 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.463761 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2thc\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-kube-api-access-r2thc\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.464213 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.464269 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.464411 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/3e430836-74b1-48cb-84bc-f623d27d6c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.566685 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.566777 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.566847 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.566903 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.566936 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r2thc\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-kube-api-access-r2thc\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.567099 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.567143 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.567210 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e430836-74b1-48cb-84bc-f623d27d6c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.567256 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 
18:32:56.567301 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.567344 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e430836-74b1-48cb-84bc-f623d27d6c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.568190 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e430836-74b1-48cb-84bc-f623d27d6c93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.574105 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.574208 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" 
Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.574293 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.574718 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.574853 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.575138 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e430836-74b1-48cb-84bc-f623d27d6c93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.577934 4797 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.577990 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dc90504e8fd72bffa4b364aa4d9dd59b6e7cea028ec589d2165a3270de0ac3cc/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.578009 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.579950 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/3e430836-74b1-48cb-84bc-f623d27d6c93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.582815 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2thc\" (UniqueName: \"kubernetes.io/projected/3e430836-74b1-48cb-84bc-f623d27d6c93-kube-api-access-r2thc\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.615133 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85d898fb-3c28-48f7-8250-2d2537c8f8cf\") pod \"prometheus-metric-storage-0\" (UID: \"3e430836-74b1-48cb-84bc-f623d27d6c93\") " pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:56 crc kubenswrapper[4797]: I0930 18:32:56.919346 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 18:32:57 crc kubenswrapper[4797]: I0930 18:32:57.451467 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 18:32:58 crc kubenswrapper[4797]: I0930 18:32:58.227629 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerStarted","Data":"3b9b7a401699484fe9e20053d00a934a8bea002365f8b017279ee2db8f7f25d6"} Sep 30 18:32:58 crc kubenswrapper[4797]: I0930 18:32:58.239133 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:32:58 crc kubenswrapper[4797]: E0930 18:32:58.239588 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:32:58 crc kubenswrapper[4797]: I0930 18:32:58.251236 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572d2f77-3315-4b90-860e-18d1973993ef" path="/var/lib/kubelet/pods/572d2f77-3315-4b90-860e-18d1973993ef/volumes" Sep 30 18:33:02 crc kubenswrapper[4797]: I0930 18:33:02.264030 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerStarted","Data":"22887f37619cd11e1cb9ee7488726d226c8d7f4ed795d2587e25d823614f61c5"} Sep 30 18:33:09 crc kubenswrapper[4797]: I0930 18:33:09.384927 4797 generic.go:334] "Generic (PLEG): container finished" podID="3e430836-74b1-48cb-84bc-f623d27d6c93" containerID="22887f37619cd11e1cb9ee7488726d226c8d7f4ed795d2587e25d823614f61c5" exitCode=0 Sep 30 18:33:09 crc kubenswrapper[4797]: I0930 18:33:09.385036 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerDied","Data":"22887f37619cd11e1cb9ee7488726d226c8d7f4ed795d2587e25d823614f61c5"} Sep 30 18:33:10 crc kubenswrapper[4797]: I0930 18:33:10.401963 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerStarted","Data":"c9e147b8dfa26eaa5597bf150ac5dd95bd79d9ee5e9a75b5737f49a1abdd5e02"} Sep 30 18:33:12 crc kubenswrapper[4797]: I0930 18:33:12.238057 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:33:12 crc kubenswrapper[4797]: E0930 18:33:12.238696 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:33:13 crc kubenswrapper[4797]: I0930 18:33:13.436286 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerStarted","Data":"d9612f98ecc5ccc281e4dc1d0522839368d85dc832158c420daee21353faad90"} Sep 30 18:33:14 crc kubenswrapper[4797]: I0930 18:33:14.451752 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e430836-74b1-48cb-84bc-f623d27d6c93","Type":"ContainerStarted","Data":"7b4d7061973bf5b41f342261873b39df85823ddb2bb6131a42e6ce6d28ba525c"} Sep 30 18:33:14 crc kubenswrapper[4797]: I0930 18:33:14.492596 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.492575347 podStartE2EDuration="18.492575347s" podCreationTimestamp="2025-09-30 18:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:33:14.488852596 +0000 UTC m=+3045.011351864" watchObservedRunningTime="2025-09-30 18:33:14.492575347 +0000 UTC m=+3045.015074605" Sep 30 18:33:16 crc kubenswrapper[4797]: I0930 18:33:16.920551 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 18:33:16 crc kubenswrapper[4797]: I0930 18:33:16.936480 4797 scope.go:117] "RemoveContainer" containerID="5e3044933f061d0aba07cb9247c7818ce313a8ac44f853b909a494d2eb49c327" Sep 30 18:33:16 crc kubenswrapper[4797]: I0930 18:33:16.979932 4797 scope.go:117] "RemoveContainer" containerID="7fdaa5a582c2a250506cca3e78e660c432758d28c95f692db0c6bbbcc9027b68" Sep 30 18:33:17 crc kubenswrapper[4797]: I0930 18:33:17.031121 4797 scope.go:117] "RemoveContainer" containerID="0e4af960f0a26fef96c5fd646c9bd99282c8a90827127f63d26c69c2bcc27795" Sep 30 18:33:17 crc kubenswrapper[4797]: I0930 18:33:17.049888 4797 scope.go:117] "RemoveContainer" containerID="98b0a5b108c6563502107929a2d92851be5ca924b5ec9a9eb068abf5b67c0a0f" Sep 30 18:33:23 crc kubenswrapper[4797]: I0930 18:33:23.238537 4797 
scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:33:23 crc kubenswrapper[4797]: E0930 18:33:23.239747 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:33:26 crc kubenswrapper[4797]: I0930 18:33:26.919982 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 18:33:26 crc kubenswrapper[4797]: I0930 18:33:26.927596 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 18:33:27 crc kubenswrapper[4797]: I0930 18:33:27.637082 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 18:33:37 crc kubenswrapper[4797]: I0930 18:33:37.238817 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:33:37 crc kubenswrapper[4797]: E0930 18:33:37.240078 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:33:48 crc kubenswrapper[4797]: I0930 18:33:48.241792 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" 
Sep 30 18:33:48 crc kubenswrapper[4797]: E0930 18:33:48.242516 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.224639 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.227497 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.232096 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.232096 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mcnmz" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.232676 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.233994 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.271322 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368123 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: 
\"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368252 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368361 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhr2r\" (UniqueName: \"kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368491 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368560 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368594 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368697 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.368730 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.470955 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471169 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471206 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471306 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471381 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471496 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " 
pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471545 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhr2r\" (UniqueName: \"kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.471585 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.472111 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.472494 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.472939 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 
crc kubenswrapper[4797]: I0930 18:33:50.473010 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.475087 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.478727 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.490750 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.491597 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.502493 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhr2r\" (UniqueName: 
\"kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.509573 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " pod="openstack/tempest-tests-tempest" Sep 30 18:33:50 crc kubenswrapper[4797]: I0930 18:33:50.567748 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 18:33:51 crc kubenswrapper[4797]: I0930 18:33:51.143113 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 18:33:51 crc kubenswrapper[4797]: I0930 18:33:51.917843 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"05275916-b3be-4d53-8a06-ab3d5c8b3f7b","Type":"ContainerStarted","Data":"5ad9aa9d3899cb4746565fb20d6078fbd3dabcfd76315e26e69794abcfe249fd"} Sep 30 18:34:00 crc kubenswrapper[4797]: I0930 18:34:00.249495 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:34:00 crc kubenswrapper[4797]: E0930 18:34:00.250181 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:34:04 crc kubenswrapper[4797]: I0930 18:34:04.055031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"05275916-b3be-4d53-8a06-ab3d5c8b3f7b","Type":"ContainerStarted","Data":"24486d04c2bca481ba2dd1fa05c0bb4a4a9591005d7f4bb913ff987d688382e9"} Sep 30 18:34:04 crc kubenswrapper[4797]: I0930 18:34:04.076399 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.457489678 podStartE2EDuration="15.076375407s" podCreationTimestamp="2025-09-30 18:33:49 +0000 UTC" firstStartedPulling="2025-09-30 18:33:51.159220248 +0000 UTC m=+3081.681719506" lastFinishedPulling="2025-09-30 18:34:02.778105987 +0000 UTC m=+3093.300605235" observedRunningTime="2025-09-30 18:34:04.074837354 +0000 UTC m=+3094.597336622" watchObservedRunningTime="2025-09-30 18:34:04.076375407 +0000 UTC m=+3094.598874685" Sep 30 18:34:14 crc kubenswrapper[4797]: I0930 18:34:14.239143 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:34:14 crc kubenswrapper[4797]: E0930 18:34:14.240174 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:34:29 crc kubenswrapper[4797]: I0930 18:34:29.238457 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:34:29 crc kubenswrapper[4797]: E0930 18:34:29.239319 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:34:43 crc kubenswrapper[4797]: I0930 18:34:43.238404 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:34:43 crc kubenswrapper[4797]: E0930 18:34:43.239345 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:34:55 crc kubenswrapper[4797]: I0930 18:34:55.239249 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:34:55 crc kubenswrapper[4797]: E0930 18:34:55.240340 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:35:06 crc kubenswrapper[4797]: I0930 18:35:06.239240 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:35:06 crc kubenswrapper[4797]: E0930 18:35:06.241998 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:35:19 crc kubenswrapper[4797]: I0930 18:35:19.238546 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:35:19 crc kubenswrapper[4797]: E0930 18:35:19.239495 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.561958 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.566124 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.582249 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.765564 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.765613 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqmz\" (UniqueName: \"kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.765648 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.867102 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.867155 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7lqmz\" (UniqueName: \"kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.867186 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.867729 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.867759 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.896036 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqmz\" (UniqueName: \"kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz\") pod \"certified-operators-cvsq8\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:24 crc kubenswrapper[4797]: I0930 18:35:24.913185 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:25 crc kubenswrapper[4797]: I0930 18:35:25.393198 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:26 crc kubenswrapper[4797]: I0930 18:35:26.098066 4797 generic.go:334] "Generic (PLEG): container finished" podID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerID="649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c" exitCode=0 Sep 30 18:35:26 crc kubenswrapper[4797]: I0930 18:35:26.098721 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerDied","Data":"649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c"} Sep 30 18:35:26 crc kubenswrapper[4797]: I0930 18:35:26.099191 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerStarted","Data":"206acd38070962353f319e122d4c445565685edd7599e699eb19134ed79ce3c1"} Sep 30 18:35:26 crc kubenswrapper[4797]: I0930 18:35:26.101825 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:35:27 crc kubenswrapper[4797]: I0930 18:35:27.110864 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerStarted","Data":"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23"} Sep 30 18:35:31 crc kubenswrapper[4797]: I0930 18:35:31.238131 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:35:31 crc kubenswrapper[4797]: E0930 18:35:31.239019 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:35:32 crc kubenswrapper[4797]: I0930 18:35:32.177831 4797 generic.go:334] "Generic (PLEG): container finished" podID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerID="3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23" exitCode=0 Sep 30 18:35:32 crc kubenswrapper[4797]: I0930 18:35:32.177893 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerDied","Data":"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23"} Sep 30 18:35:33 crc kubenswrapper[4797]: I0930 18:35:33.191358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerStarted","Data":"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8"} Sep 30 18:35:33 crc kubenswrapper[4797]: I0930 18:35:33.226840 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cvsq8" podStartSLOduration=2.694064663 podStartE2EDuration="9.226817815s" podCreationTimestamp="2025-09-30 18:35:24 +0000 UTC" firstStartedPulling="2025-09-30 18:35:26.101641673 +0000 UTC m=+3176.624140911" lastFinishedPulling="2025-09-30 18:35:32.634394835 +0000 UTC m=+3183.156894063" observedRunningTime="2025-09-30 18:35:33.214278282 +0000 UTC m=+3183.736777560" watchObservedRunningTime="2025-09-30 18:35:33.226817815 +0000 UTC m=+3183.749317063" Sep 30 18:35:34 crc kubenswrapper[4797]: I0930 18:35:34.913529 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:34 crc kubenswrapper[4797]: I0930 18:35:34.913839 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:34 crc kubenswrapper[4797]: I0930 18:35:34.982063 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:44 crc kubenswrapper[4797]: I0930 18:35:44.981952 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:45 crc kubenswrapper[4797]: I0930 18:35:45.055657 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:45 crc kubenswrapper[4797]: I0930 18:35:45.238641 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:35:45 crc kubenswrapper[4797]: E0930 18:35:45.238882 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:35:45 crc kubenswrapper[4797]: I0930 18:35:45.338026 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cvsq8" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="registry-server" containerID="cri-o://ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8" gracePeriod=2 Sep 30 18:35:45 crc kubenswrapper[4797]: E0930 18:35:45.550851 4797 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b2ae33_997e_4b09_bf47_5c447b09a779.slice/crio-ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8.scope\": RecentStats: unable to find data in memory cache]" Sep 30 18:35:45 crc kubenswrapper[4797]: I0930 18:35:45.958157 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.001220 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content\") pod \"05b2ae33-997e-4b09-bf47-5c447b09a779\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.002121 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities\") pod \"05b2ae33-997e-4b09-bf47-5c447b09a779\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.002215 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lqmz\" (UniqueName: \"kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz\") pod \"05b2ae33-997e-4b09-bf47-5c447b09a779\" (UID: \"05b2ae33-997e-4b09-bf47-5c447b09a779\") " Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.032594 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities" (OuterVolumeSpecName: "utilities") pod "05b2ae33-997e-4b09-bf47-5c447b09a779" (UID: "05b2ae33-997e-4b09-bf47-5c447b09a779"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.037557 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz" (OuterVolumeSpecName: "kube-api-access-7lqmz") pod "05b2ae33-997e-4b09-bf47-5c447b09a779" (UID: "05b2ae33-997e-4b09-bf47-5c447b09a779"). InnerVolumeSpecName "kube-api-access-7lqmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.074250 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05b2ae33-997e-4b09-bf47-5c447b09a779" (UID: "05b2ae33-997e-4b09-bf47-5c447b09a779"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.104140 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.104307 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b2ae33-997e-4b09-bf47-5c447b09a779-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.104388 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lqmz\" (UniqueName: \"kubernetes.io/projected/05b2ae33-997e-4b09-bf47-5c447b09a779-kube-api-access-7lqmz\") on node \"crc\" DevicePath \"\"" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.355696 4797 generic.go:334] "Generic (PLEG): container finished" podID="05b2ae33-997e-4b09-bf47-5c447b09a779" 
containerID="ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8" exitCode=0 Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.355748 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerDied","Data":"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8"} Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.355834 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvsq8" event={"ID":"05b2ae33-997e-4b09-bf47-5c447b09a779","Type":"ContainerDied","Data":"206acd38070962353f319e122d4c445565685edd7599e699eb19134ed79ce3c1"} Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.355868 4797 scope.go:117] "RemoveContainer" containerID="ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.356246 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cvsq8" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.398722 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.407202 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cvsq8"] Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.407318 4797 scope.go:117] "RemoveContainer" containerID="3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.444181 4797 scope.go:117] "RemoveContainer" containerID="649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.489348 4797 scope.go:117] "RemoveContainer" containerID="ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8" Sep 30 18:35:46 crc kubenswrapper[4797]: E0930 18:35:46.491357 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8\": container with ID starting with ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8 not found: ID does not exist" containerID="ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.491500 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8"} err="failed to get container status \"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8\": rpc error: code = NotFound desc = could not find container \"ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8\": container with ID starting with ca7ae72ffd918d5a2d39d0f895274934e3cfd79c4f4ec909ba192284e810f2e8 not 
found: ID does not exist" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.491560 4797 scope.go:117] "RemoveContainer" containerID="3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23" Sep 30 18:35:46 crc kubenswrapper[4797]: E0930 18:35:46.492271 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23\": container with ID starting with 3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23 not found: ID does not exist" containerID="3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.492333 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23"} err="failed to get container status \"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23\": rpc error: code = NotFound desc = could not find container \"3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23\": container with ID starting with 3c6bf4f606b0a46f0eaaec5028ea70a68ace8cc33b1b5e8cc731c1e369147b23 not found: ID does not exist" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.492375 4797 scope.go:117] "RemoveContainer" containerID="649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c" Sep 30 18:35:46 crc kubenswrapper[4797]: E0930 18:35:46.492998 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c\": container with ID starting with 649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c not found: ID does not exist" containerID="649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c" Sep 30 18:35:46 crc kubenswrapper[4797]: I0930 18:35:46.493077 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c"} err="failed to get container status \"649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c\": rpc error: code = NotFound desc = could not find container \"649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c\": container with ID starting with 649d6c697fba4c13aee9f42d1e5808e5f1524d2cc121bb528581cffbc1f01d8c not found: ID does not exist" Sep 30 18:35:48 crc kubenswrapper[4797]: I0930 18:35:48.253749 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" path="/var/lib/kubelet/pods/05b2ae33-997e-4b09-bf47-5c447b09a779/volumes" Sep 30 18:35:58 crc kubenswrapper[4797]: I0930 18:35:58.238554 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:35:58 crc kubenswrapper[4797]: E0930 18:35:58.239690 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:36:11 crc kubenswrapper[4797]: I0930 18:36:11.238412 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:36:11 crc kubenswrapper[4797]: E0930 18:36:11.239273 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:36:24 crc kubenswrapper[4797]: I0930 18:36:24.238560 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:36:24 crc kubenswrapper[4797]: I0930 18:36:24.857681 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2"} Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.842884 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:36:43 crc kubenswrapper[4797]: E0930 18:36:43.844141 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="extract-content" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.844163 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="extract-content" Sep 30 18:36:43 crc kubenswrapper[4797]: E0930 18:36:43.844186 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="registry-server" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.844201 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="registry-server" Sep 30 18:36:43 crc kubenswrapper[4797]: E0930 18:36:43.844229 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="extract-utilities" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.844243 4797 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="extract-utilities" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.844664 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b2ae33-997e-4b09-bf47-5c447b09a779" containerName="registry-server" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.847694 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.862723 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.995529 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfccv\" (UniqueName: \"kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.995727 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:43 crc kubenswrapper[4797]: I0930 18:36:43.995790 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.097954 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nfccv\" (UniqueName: \"kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.098045 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.098082 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.099015 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.099033 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.133069 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfccv\" 
(UniqueName: \"kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv\") pod \"redhat-operators-7qspk\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.180220 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:44 crc kubenswrapper[4797]: I0930 18:36:44.676746 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:36:44 crc kubenswrapper[4797]: W0930 18:36:44.682405 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2013ed6_9d45_4683_b274_6c9d780c1ae4.slice/crio-108202d04843255a2afc28eb0965e12afd443bc850a0d665b86401979d3c1360 WatchSource:0}: Error finding container 108202d04843255a2afc28eb0965e12afd443bc850a0d665b86401979d3c1360: Status 404 returned error can't find the container with id 108202d04843255a2afc28eb0965e12afd443bc850a0d665b86401979d3c1360 Sep 30 18:36:45 crc kubenswrapper[4797]: I0930 18:36:45.122539 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerID="791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6" exitCode=0 Sep 30 18:36:45 crc kubenswrapper[4797]: I0930 18:36:45.122596 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerDied","Data":"791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6"} Sep 30 18:36:45 crc kubenswrapper[4797]: I0930 18:36:45.122914 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" 
event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerStarted","Data":"108202d04843255a2afc28eb0965e12afd443bc850a0d665b86401979d3c1360"} Sep 30 18:36:47 crc kubenswrapper[4797]: I0930 18:36:47.149277 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerStarted","Data":"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c"} Sep 30 18:36:50 crc kubenswrapper[4797]: I0930 18:36:50.191670 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerID="1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c" exitCode=0 Sep 30 18:36:50 crc kubenswrapper[4797]: I0930 18:36:50.191760 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerDied","Data":"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c"} Sep 30 18:36:51 crc kubenswrapper[4797]: I0930 18:36:51.202962 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerStarted","Data":"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365"} Sep 30 18:36:51 crc kubenswrapper[4797]: I0930 18:36:51.243006 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qspk" podStartSLOduration=2.723672648 podStartE2EDuration="8.242972553s" podCreationTimestamp="2025-09-30 18:36:43 +0000 UTC" firstStartedPulling="2025-09-30 18:36:45.124700287 +0000 UTC m=+3255.647199525" lastFinishedPulling="2025-09-30 18:36:50.644000182 +0000 UTC m=+3261.166499430" observedRunningTime="2025-09-30 18:36:51.220845356 +0000 UTC m=+3261.743344614" watchObservedRunningTime="2025-09-30 18:36:51.242972553 +0000 UTC m=+3261.765471861" 
Sep 30 18:36:54 crc kubenswrapper[4797]: I0930 18:36:54.180726 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:54 crc kubenswrapper[4797]: I0930 18:36:54.181389 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:36:55 crc kubenswrapper[4797]: I0930 18:36:55.255796 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qspk" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="registry-server" probeResult="failure" output=< Sep 30 18:36:55 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 18:36:55 crc kubenswrapper[4797]: > Sep 30 18:37:04 crc kubenswrapper[4797]: I0930 18:37:04.252554 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:37:04 crc kubenswrapper[4797]: I0930 18:37:04.321151 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:37:04 crc kubenswrapper[4797]: I0930 18:37:04.495709 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.366658 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qspk" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="registry-server" containerID="cri-o://ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365" gracePeriod=2 Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.876268 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.982562 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities\") pod \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.982737 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content\") pod \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.982787 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfccv\" (UniqueName: \"kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv\") pod \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\" (UID: \"c2013ed6-9d45-4683-b274-6c9d780c1ae4\") " Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.983672 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities" (OuterVolumeSpecName: "utilities") pod "c2013ed6-9d45-4683-b274-6c9d780c1ae4" (UID: "c2013ed6-9d45-4683-b274-6c9d780c1ae4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:37:05 crc kubenswrapper[4797]: I0930 18:37:05.990232 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv" (OuterVolumeSpecName: "kube-api-access-nfccv") pod "c2013ed6-9d45-4683-b274-6c9d780c1ae4" (UID: "c2013ed6-9d45-4683-b274-6c9d780c1ae4"). InnerVolumeSpecName "kube-api-access-nfccv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.081172 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2013ed6-9d45-4683-b274-6c9d780c1ae4" (UID: "c2013ed6-9d45-4683-b274-6c9d780c1ae4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.086346 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.086392 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2013ed6-9d45-4683-b274-6c9d780c1ae4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.086468 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfccv\" (UniqueName: \"kubernetes.io/projected/c2013ed6-9d45-4683-b274-6c9d780c1ae4-kube-api-access-nfccv\") on node \"crc\" DevicePath \"\"" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.376769 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerID="ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365" exitCode=0 Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.376808 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerDied","Data":"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365"} Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.376818 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qspk" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.376834 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qspk" event={"ID":"c2013ed6-9d45-4683-b274-6c9d780c1ae4","Type":"ContainerDied","Data":"108202d04843255a2afc28eb0965e12afd443bc850a0d665b86401979d3c1360"} Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.376851 4797 scope.go:117] "RemoveContainer" containerID="ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.401580 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.411908 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qspk"] Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.421496 4797 scope.go:117] "RemoveContainer" containerID="1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.446659 4797 scope.go:117] "RemoveContainer" containerID="791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.488840 4797 scope.go:117] "RemoveContainer" containerID="ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365" Sep 30 18:37:06 crc kubenswrapper[4797]: E0930 18:37:06.490796 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365\": container with ID starting with ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365 not found: ID does not exist" containerID="ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.490893 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365"} err="failed to get container status \"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365\": rpc error: code = NotFound desc = could not find container \"ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365\": container with ID starting with ae8eb0e03073f6b411c403af9166301b6784200c0a35a550cb9f518b58554365 not found: ID does not exist" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.490961 4797 scope.go:117] "RemoveContainer" containerID="1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c" Sep 30 18:37:06 crc kubenswrapper[4797]: E0930 18:37:06.491550 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c\": container with ID starting with 1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c not found: ID does not exist" containerID="1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.491615 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c"} err="failed to get container status \"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c\": rpc error: code = NotFound desc = could not find container \"1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c\": container with ID starting with 1cb1dea66b06ddde31c5b8b01789634dd028b6bab199b3778a299861892fbb8c not found: ID does not exist" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.491638 4797 scope.go:117] "RemoveContainer" containerID="791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6" Sep 30 18:37:06 crc kubenswrapper[4797]: E0930 
18:37:06.491877 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6\": container with ID starting with 791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6 not found: ID does not exist" containerID="791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6" Sep 30 18:37:06 crc kubenswrapper[4797]: I0930 18:37:06.491914 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6"} err="failed to get container status \"791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6\": rpc error: code = NotFound desc = could not find container \"791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6\": container with ID starting with 791c6a201a08f4bb438cae450b9ef6ba887753a63cebf73957030b7413b0fbe6 not found: ID does not exist" Sep 30 18:37:08 crc kubenswrapper[4797]: I0930 18:37:08.251451 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" path="/var/lib/kubelet/pods/c2013ed6-9d45-4683-b274-6c9d780c1ae4/volumes" Sep 30 18:38:44 crc kubenswrapper[4797]: I0930 18:38:44.191861 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:38:44 crc kubenswrapper[4797]: I0930 18:38:44.194347 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 18:39:14 crc kubenswrapper[4797]: I0930 18:39:14.192154 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:39:14 crc kubenswrapper[4797]: I0930 18:39:14.192868 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.454024 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:40 crc kubenswrapper[4797]: E0930 18:39:40.456205 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="extract-utilities" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.456315 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="extract-utilities" Sep 30 18:39:40 crc kubenswrapper[4797]: E0930 18:39:40.456425 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="extract-content" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.456591 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="extract-content" Sep 30 18:39:40 crc kubenswrapper[4797]: E0930 18:39:40.456686 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="registry-server" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.456862 4797 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="registry-server" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.457243 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2013ed6-9d45-4683-b274-6c9d780c1ae4" containerName="registry-server" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.459703 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.468808 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.598813 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.599208 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpvx\" (UniqueName: \"kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.599463 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 
18:39:40.701382 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.701529 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.701577 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpvx\" (UniqueName: \"kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.701880 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.702099 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.723064 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpvx\" (UniqueName: \"kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx\") pod \"community-operators-7bdx7\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:40 crc kubenswrapper[4797]: I0930 18:39:40.806235 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:41 crc kubenswrapper[4797]: I0930 18:39:41.347159 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:42 crc kubenswrapper[4797]: I0930 18:39:42.101940 4797 generic.go:334] "Generic (PLEG): container finished" podID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerID="5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb" exitCode=0 Sep 30 18:39:42 crc kubenswrapper[4797]: I0930 18:39:42.102076 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerDied","Data":"5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb"} Sep 30 18:39:42 crc kubenswrapper[4797]: I0930 18:39:42.102751 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerStarted","Data":"c1a08b39d1efdf9e910096d108cdfca995a1f237db287061b3aaec0e2e79a229"} Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.132134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerStarted","Data":"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e"} Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.191586 4797 
patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.191642 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.191689 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.192538 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:39:44 crc kubenswrapper[4797]: I0930 18:39:44.192609 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2" gracePeriod=600 Sep 30 18:39:45 crc kubenswrapper[4797]: I0930 18:39:45.147306 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2" exitCode=0 Sep 30 18:39:45 crc 
kubenswrapper[4797]: I0930 18:39:45.147351 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2"} Sep 30 18:39:45 crc kubenswrapper[4797]: I0930 18:39:45.148089 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125"} Sep 30 18:39:45 crc kubenswrapper[4797]: I0930 18:39:45.148124 4797 scope.go:117] "RemoveContainer" containerID="fd15a4fb76746f523e3e0c4bb0edfec9f09073cc66260a897dc922f6c7468478" Sep 30 18:39:45 crc kubenswrapper[4797]: I0930 18:39:45.151592 4797 generic.go:334] "Generic (PLEG): container finished" podID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerID="9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e" exitCode=0 Sep 30 18:39:45 crc kubenswrapper[4797]: I0930 18:39:45.151663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerDied","Data":"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e"} Sep 30 18:39:46 crc kubenswrapper[4797]: I0930 18:39:46.171609 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerStarted","Data":"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354"} Sep 30 18:39:46 crc kubenswrapper[4797]: I0930 18:39:46.203187 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7bdx7" podStartSLOduration=2.6435115319999998 podStartE2EDuration="6.203168468s" 
podCreationTimestamp="2025-09-30 18:39:40 +0000 UTC" firstStartedPulling="2025-09-30 18:39:42.10358527 +0000 UTC m=+3432.626084528" lastFinishedPulling="2025-09-30 18:39:45.663242216 +0000 UTC m=+3436.185741464" observedRunningTime="2025-09-30 18:39:46.197256506 +0000 UTC m=+3436.719755754" watchObservedRunningTime="2025-09-30 18:39:46.203168468 +0000 UTC m=+3436.725667716" Sep 30 18:39:50 crc kubenswrapper[4797]: I0930 18:39:50.806679 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:50 crc kubenswrapper[4797]: I0930 18:39:50.807372 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:50 crc kubenswrapper[4797]: I0930 18:39:50.868917 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:51 crc kubenswrapper[4797]: I0930 18:39:51.274117 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:51 crc kubenswrapper[4797]: I0930 18:39:51.330466 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.247779 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7bdx7" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="registry-server" containerID="cri-o://1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354" gracePeriod=2 Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.828740 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.874498 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdpvx\" (UniqueName: \"kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx\") pod \"c74214c8-2863-4426-ac97-68c5492c7c0c\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.874645 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content\") pod \"c74214c8-2863-4426-ac97-68c5492c7c0c\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.874744 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities\") pod \"c74214c8-2863-4426-ac97-68c5492c7c0c\" (UID: \"c74214c8-2863-4426-ac97-68c5492c7c0c\") " Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.875852 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities" (OuterVolumeSpecName: "utilities") pod "c74214c8-2863-4426-ac97-68c5492c7c0c" (UID: "c74214c8-2863-4426-ac97-68c5492c7c0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.880220 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx" (OuterVolumeSpecName: "kube-api-access-wdpvx") pod "c74214c8-2863-4426-ac97-68c5492c7c0c" (UID: "c74214c8-2863-4426-ac97-68c5492c7c0c"). InnerVolumeSpecName "kube-api-access-wdpvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.921564 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c74214c8-2863-4426-ac97-68c5492c7c0c" (UID: "c74214c8-2863-4426-ac97-68c5492c7c0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.976875 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdpvx\" (UniqueName: \"kubernetes.io/projected/c74214c8-2863-4426-ac97-68c5492c7c0c-kube-api-access-wdpvx\") on node \"crc\" DevicePath \"\"" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.976916 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:39:53 crc kubenswrapper[4797]: I0930 18:39:53.976928 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74214c8-2863-4426-ac97-68c5492c7c0c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.258640 4797 generic.go:334] "Generic (PLEG): container finished" podID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerID="1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354" exitCode=0 Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.258690 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerDied","Data":"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354"} Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.258730 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7bdx7" event={"ID":"c74214c8-2863-4426-ac97-68c5492c7c0c","Type":"ContainerDied","Data":"c1a08b39d1efdf9e910096d108cdfca995a1f237db287061b3aaec0e2e79a229"} Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.258747 4797 scope.go:117] "RemoveContainer" containerID="1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.259541 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bdx7" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.297305 4797 scope.go:117] "RemoveContainer" containerID="9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.317496 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.326706 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7bdx7"] Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.333302 4797 scope.go:117] "RemoveContainer" containerID="5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.382236 4797 scope.go:117] "RemoveContainer" containerID="1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354" Sep 30 18:39:54 crc kubenswrapper[4797]: E0930 18:39:54.382801 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354\": container with ID starting with 1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354 not found: ID does not exist" containerID="1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 
18:39:54.382852 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354"} err="failed to get container status \"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354\": rpc error: code = NotFound desc = could not find container \"1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354\": container with ID starting with 1a7bfd520900d1b4a0baef7d94fd3c6d80769ea6c91656761931bfa9bb814354 not found: ID does not exist" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.382883 4797 scope.go:117] "RemoveContainer" containerID="9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e" Sep 30 18:39:54 crc kubenswrapper[4797]: E0930 18:39:54.383156 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e\": container with ID starting with 9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e not found: ID does not exist" containerID="9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.383182 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e"} err="failed to get container status \"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e\": rpc error: code = NotFound desc = could not find container \"9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e\": container with ID starting with 9cd320b71c72bb437b85791b344492c7fe0bb4e0e51dd2cdb3e488ce591f432e not found: ID does not exist" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.383199 4797 scope.go:117] "RemoveContainer" containerID="5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb" Sep 30 18:39:54 crc 
kubenswrapper[4797]: E0930 18:39:54.383451 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb\": container with ID starting with 5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb not found: ID does not exist" containerID="5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb" Sep 30 18:39:54 crc kubenswrapper[4797]: I0930 18:39:54.383478 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb"} err="failed to get container status \"5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb\": rpc error: code = NotFound desc = could not find container \"5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb\": container with ID starting with 5451f290c8921b06f378a02a45db5596d2d9b1aca57fe7e3aed827f2ebe298fb not found: ID does not exist" Sep 30 18:39:56 crc kubenswrapper[4797]: I0930 18:39:56.267629 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" path="/var/lib/kubelet/pods/c74214c8-2863-4426-ac97-68c5492c7c0c/volumes" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.172449 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:04 crc kubenswrapper[4797]: E0930 18:42:04.176375 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="extract-content" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.176516 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="extract-content" Sep 30 18:42:04 crc kubenswrapper[4797]: E0930 18:42:04.176537 4797 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="extract-utilities" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.176546 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="extract-utilities" Sep 30 18:42:04 crc kubenswrapper[4797]: E0930 18:42:04.176586 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="registry-server" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.176595 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="registry-server" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.176915 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74214c8-2863-4426-ac97-68c5492c7c0c" containerName="registry-server" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.178644 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.182685 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.347020 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.347297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") 
" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.347422 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhl4\" (UniqueName: \"kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.449371 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.449462 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.449498 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhl4\" (UniqueName: \"kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.449978 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " 
pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.450106 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.480146 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhl4\" (UniqueName: \"kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4\") pod \"redhat-marketplace-wrqxp\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:04 crc kubenswrapper[4797]: I0930 18:42:04.499212 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:05 crc kubenswrapper[4797]: I0930 18:42:05.034446 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:05 crc kubenswrapper[4797]: W0930 18:42:05.041577 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded55fe04_7b28_4f8b_9b8b_76b20fa26598.slice/crio-e1056d187ea5e4068d728149e86d0a35ac755f7f7a2ee01ad374753d3be9b680 WatchSource:0}: Error finding container e1056d187ea5e4068d728149e86d0a35ac755f7f7a2ee01ad374753d3be9b680: Status 404 returned error can't find the container with id e1056d187ea5e4068d728149e86d0a35ac755f7f7a2ee01ad374753d3be9b680 Sep 30 18:42:05 crc kubenswrapper[4797]: I0930 18:42:05.659349 4797 generic.go:334] "Generic (PLEG): container finished" podID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerID="c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537" exitCode=0 Sep 30 
18:42:05 crc kubenswrapper[4797]: I0930 18:42:05.659495 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerDied","Data":"c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537"} Sep 30 18:42:05 crc kubenswrapper[4797]: I0930 18:42:05.659710 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerStarted","Data":"e1056d187ea5e4068d728149e86d0a35ac755f7f7a2ee01ad374753d3be9b680"} Sep 30 18:42:05 crc kubenswrapper[4797]: I0930 18:42:05.664916 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:42:06 crc kubenswrapper[4797]: I0930 18:42:06.670359 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerStarted","Data":"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4"} Sep 30 18:42:07 crc kubenswrapper[4797]: I0930 18:42:07.680497 4797 generic.go:334] "Generic (PLEG): container finished" podID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerID="4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4" exitCode=0 Sep 30 18:42:07 crc kubenswrapper[4797]: I0930 18:42:07.680671 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerDied","Data":"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4"} Sep 30 18:42:08 crc kubenswrapper[4797]: I0930 18:42:08.693925 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" 
event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerStarted","Data":"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70"} Sep 30 18:42:08 crc kubenswrapper[4797]: I0930 18:42:08.717705 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wrqxp" podStartSLOduration=2.178955235 podStartE2EDuration="4.71768172s" podCreationTimestamp="2025-09-30 18:42:04 +0000 UTC" firstStartedPulling="2025-09-30 18:42:05.664634174 +0000 UTC m=+3576.187133422" lastFinishedPulling="2025-09-30 18:42:08.203360669 +0000 UTC m=+3578.725859907" observedRunningTime="2025-09-30 18:42:08.71364584 +0000 UTC m=+3579.236145098" watchObservedRunningTime="2025-09-30 18:42:08.71768172 +0000 UTC m=+3579.240180978" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.191714 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.192200 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.499516 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.500529 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.595044 4797 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.795642 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:14 crc kubenswrapper[4797]: I0930 18:42:14.854912 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:16 crc kubenswrapper[4797]: I0930 18:42:16.772702 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wrqxp" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="registry-server" containerID="cri-o://8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70" gracePeriod=2 Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.251868 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.410875 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities\") pod \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.411134 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content\") pod \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.411172 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chhl4\" (UniqueName: 
\"kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4\") pod \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\" (UID: \"ed55fe04-7b28-4f8b-9b8b-76b20fa26598\") " Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.412258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities" (OuterVolumeSpecName: "utilities") pod "ed55fe04-7b28-4f8b-9b8b-76b20fa26598" (UID: "ed55fe04-7b28-4f8b-9b8b-76b20fa26598"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.418791 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4" (OuterVolumeSpecName: "kube-api-access-chhl4") pod "ed55fe04-7b28-4f8b-9b8b-76b20fa26598" (UID: "ed55fe04-7b28-4f8b-9b8b-76b20fa26598"). InnerVolumeSpecName "kube-api-access-chhl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.431149 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed55fe04-7b28-4f8b-9b8b-76b20fa26598" (UID: "ed55fe04-7b28-4f8b-9b8b-76b20fa26598"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.514134 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.514170 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.514188 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chhl4\" (UniqueName: \"kubernetes.io/projected/ed55fe04-7b28-4f8b-9b8b-76b20fa26598-kube-api-access-chhl4\") on node \"crc\" DevicePath \"\"" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.785654 4797 generic.go:334] "Generic (PLEG): container finished" podID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerID="8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70" exitCode=0 Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.785931 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerDied","Data":"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70"} Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.785963 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrqxp" event={"ID":"ed55fe04-7b28-4f8b-9b8b-76b20fa26598","Type":"ContainerDied","Data":"e1056d187ea5e4068d728149e86d0a35ac755f7f7a2ee01ad374753d3be9b680"} Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.785983 4797 scope.go:117] "RemoveContainer" containerID="8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 
18:42:17.786119 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrqxp" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.807421 4797 scope.go:117] "RemoveContainer" containerID="4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.839175 4797 scope.go:117] "RemoveContainer" containerID="c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.844053 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.855479 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrqxp"] Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.884605 4797 scope.go:117] "RemoveContainer" containerID="8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70" Sep 30 18:42:17 crc kubenswrapper[4797]: E0930 18:42:17.885018 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70\": container with ID starting with 8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70 not found: ID does not exist" containerID="8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.885055 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70"} err="failed to get container status \"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70\": rpc error: code = NotFound desc = could not find container \"8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70\": container with ID starting with 
8a5b21416c57b57cf337f2a70095ffcfae6a8c201d4d9984bc9b3b1bd4c80c70 not found: ID does not exist" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.885083 4797 scope.go:117] "RemoveContainer" containerID="4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4" Sep 30 18:42:17 crc kubenswrapper[4797]: E0930 18:42:17.885759 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4\": container with ID starting with 4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4 not found: ID does not exist" containerID="4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.885786 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4"} err="failed to get container status \"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4\": rpc error: code = NotFound desc = could not find container \"4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4\": container with ID starting with 4a069178e61f33bf440c7dbb2f59a3de4a9fbb26b41d5f2c8834cd51629f43f4 not found: ID does not exist" Sep 30 18:42:17 crc kubenswrapper[4797]: I0930 18:42:17.885806 4797 scope.go:117] "RemoveContainer" containerID="c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537" Sep 30 18:42:17 crc kubenswrapper[4797]: E0930 18:42:17.886026 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537\": container with ID starting with c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537 not found: ID does not exist" containerID="c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537" Sep 30 18:42:17 crc 
kubenswrapper[4797]: I0930 18:42:17.886133 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537"} err="failed to get container status \"c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537\": rpc error: code = NotFound desc = could not find container \"c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537\": container with ID starting with c42863574badfdc1a058d62034afe4553e4f65f25e4ddb0ad0a058cd099ef537 not found: ID does not exist" Sep 30 18:42:18 crc kubenswrapper[4797]: I0930 18:42:18.247774 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" path="/var/lib/kubelet/pods/ed55fe04-7b28-4f8b-9b8b-76b20fa26598/volumes" Sep 30 18:42:44 crc kubenswrapper[4797]: I0930 18:42:44.192355 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:42:44 crc kubenswrapper[4797]: I0930 18:42:44.193707 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.193130 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.195219 4797 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.195338 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.196622 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.196728 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" gracePeriod=600 Sep 30 18:43:14 crc kubenswrapper[4797]: E0930 18:43:14.337725 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.501198 4797 generic.go:334] "Generic (PLEG): container finished" 
podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" exitCode=0 Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.501265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125"} Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.501320 4797 scope.go:117] "RemoveContainer" containerID="9e1d01299a219d7c8a162c539533219f9e4305fba180f202f73fcc62118e96d2" Sep 30 18:43:14 crc kubenswrapper[4797]: I0930 18:43:14.502160 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:43:14 crc kubenswrapper[4797]: E0930 18:43:14.502772 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:43:29 crc kubenswrapper[4797]: I0930 18:43:29.239464 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:43:29 crc kubenswrapper[4797]: E0930 18:43:29.240325 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 
18:43:41 crc kubenswrapper[4797]: I0930 18:43:41.238727 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:43:41 crc kubenswrapper[4797]: E0930 18:43:41.239868 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:43:52 crc kubenswrapper[4797]: I0930 18:43:52.241239 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:43:52 crc kubenswrapper[4797]: E0930 18:43:52.242263 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:44:04 crc kubenswrapper[4797]: I0930 18:44:04.238414 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:44:04 crc kubenswrapper[4797]: E0930 18:44:04.239173 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:44:19 crc kubenswrapper[4797]: I0930 18:44:19.238722 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:44:19 crc kubenswrapper[4797]: E0930 18:44:19.239506 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:44:30 crc kubenswrapper[4797]: I0930 18:44:30.247951 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:44:30 crc kubenswrapper[4797]: E0930 18:44:30.249729 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:44:42 crc kubenswrapper[4797]: I0930 18:44:42.238277 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:44:42 crc kubenswrapper[4797]: E0930 18:44:42.238902 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:44:56 crc kubenswrapper[4797]: I0930 18:44:56.238512 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:44:56 crc kubenswrapper[4797]: E0930 18:44:56.239300 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.169327 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc"] Sep 30 18:45:00 crc kubenswrapper[4797]: E0930 18:45:00.170748 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="extract-content" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.170773 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="extract-content" Sep 30 18:45:00 crc kubenswrapper[4797]: E0930 18:45:00.170825 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.170843 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4797]: E0930 18:45:00.170877 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="extract-utilities" Sep 30 18:45:00 crc 
kubenswrapper[4797]: I0930 18:45:00.170894 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="extract-utilities" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.171382 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed55fe04-7b28-4f8b-9b8b-76b20fa26598" containerName="registry-server" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.172962 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.175993 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.176097 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.183895 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc"] Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.311949 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.312354 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.312399 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsr9z\" (UniqueName: \"kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.414058 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.414198 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.414261 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsr9z\" (UniqueName: \"kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.415425 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.424044 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.444200 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsr9z\" (UniqueName: \"kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z\") pod \"collect-profiles-29320965-6kchc\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:00 crc kubenswrapper[4797]: I0930 18:45:00.506045 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:01 crc kubenswrapper[4797]: I0930 18:45:01.052463 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc"] Sep 30 18:45:01 crc kubenswrapper[4797]: I0930 18:45:01.760507 4797 generic.go:334] "Generic (PLEG): container finished" podID="40e54865-5d25-47e2-ae20-0db2c3c3a44b" containerID="21f2908886b56ef00ffb32b1f0cba4efb4c80e2ba85bd670e2b6ef999539c590" exitCode=0 Sep 30 18:45:01 crc kubenswrapper[4797]: I0930 18:45:01.760553 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" event={"ID":"40e54865-5d25-47e2-ae20-0db2c3c3a44b","Type":"ContainerDied","Data":"21f2908886b56ef00ffb32b1f0cba4efb4c80e2ba85bd670e2b6ef999539c590"} Sep 30 18:45:01 crc kubenswrapper[4797]: I0930 18:45:01.760798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" event={"ID":"40e54865-5d25-47e2-ae20-0db2c3c3a44b","Type":"ContainerStarted","Data":"0347262fd78f113a2c7df0a1ea29a098917a59ef60f38e82ab84666d24b1d422"} Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.177014 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.281408 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume\") pod \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.281803 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume\") pod \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.281871 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsr9z\" (UniqueName: \"kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z\") pod \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\" (UID: \"40e54865-5d25-47e2-ae20-0db2c3c3a44b\") " Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.282495 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume" (OuterVolumeSpecName: "config-volume") pod "40e54865-5d25-47e2-ae20-0db2c3c3a44b" (UID: "40e54865-5d25-47e2-ae20-0db2c3c3a44b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.282848 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40e54865-5d25-47e2-ae20-0db2c3c3a44b-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.288767 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z" (OuterVolumeSpecName: "kube-api-access-dsr9z") pod "40e54865-5d25-47e2-ae20-0db2c3c3a44b" (UID: "40e54865-5d25-47e2-ae20-0db2c3c3a44b"). InnerVolumeSpecName "kube-api-access-dsr9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.290607 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40e54865-5d25-47e2-ae20-0db2c3c3a44b" (UID: "40e54865-5d25-47e2-ae20-0db2c3c3a44b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.385078 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40e54865-5d25-47e2-ae20-0db2c3c3a44b-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.385118 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsr9z\" (UniqueName: \"kubernetes.io/projected/40e54865-5d25-47e2-ae20-0db2c3c3a44b-kube-api-access-dsr9z\") on node \"crc\" DevicePath \"\"" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.785146 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" event={"ID":"40e54865-5d25-47e2-ae20-0db2c3c3a44b","Type":"ContainerDied","Data":"0347262fd78f113a2c7df0a1ea29a098917a59ef60f38e82ab84666d24b1d422"} Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.785194 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0347262fd78f113a2c7df0a1ea29a098917a59ef60f38e82ab84666d24b1d422" Sep 30 18:45:03 crc kubenswrapper[4797]: I0930 18:45:03.785233 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc" Sep 30 18:45:04 crc kubenswrapper[4797]: I0930 18:45:04.283063 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6"] Sep 30 18:45:04 crc kubenswrapper[4797]: I0930 18:45:04.295852 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2wfk6"] Sep 30 18:45:06 crc kubenswrapper[4797]: I0930 18:45:06.260271 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b99a78e-2107-425b-8f49-3ac3621ba170" path="/var/lib/kubelet/pods/2b99a78e-2107-425b-8f49-3ac3621ba170/volumes" Sep 30 18:45:09 crc kubenswrapper[4797]: I0930 18:45:09.238567 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:45:09 crc kubenswrapper[4797]: E0930 18:45:09.239050 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:45:17 crc kubenswrapper[4797]: I0930 18:45:17.414302 4797 scope.go:117] "RemoveContainer" containerID="cdad9f8cbad5d86fb53aa2fac76142f7966dffdbf5539e0521dc55c0a848474d" Sep 30 18:45:24 crc kubenswrapper[4797]: I0930 18:45:24.238867 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:45:24 crc kubenswrapper[4797]: E0930 18:45:24.239942 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:45:39 crc kubenswrapper[4797]: I0930 18:45:39.237817 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:45:39 crc kubenswrapper[4797]: E0930 18:45:39.238526 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:45:50 crc kubenswrapper[4797]: I0930 18:45:50.246096 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:45:50 crc kubenswrapper[4797]: E0930 18:45:50.247236 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.757834 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:45:59 crc kubenswrapper[4797]: E0930 18:45:59.758794 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e54865-5d25-47e2-ae20-0db2c3c3a44b" containerName="collect-profiles" Sep 30 
18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.758808 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e54865-5d25-47e2-ae20-0db2c3c3a44b" containerName="collect-profiles" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.759019 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e54865-5d25-47e2-ae20-0db2c3c3a44b" containerName="collect-profiles" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.760651 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.770929 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.841920 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.841979 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.842043 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2rf\" (UniqueName: \"kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " 
pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.944509 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.944580 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.944652 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2rf\" (UniqueName: \"kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.945194 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.945226 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " 
pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:45:59 crc kubenswrapper[4797]: I0930 18:45:59.971165 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2rf\" (UniqueName: \"kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf\") pod \"certified-operators-8mtvv\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:00 crc kubenswrapper[4797]: I0930 18:46:00.089311 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:00 crc kubenswrapper[4797]: I0930 18:46:00.586119 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:46:01 crc kubenswrapper[4797]: I0930 18:46:01.238711 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:46:01 crc kubenswrapper[4797]: E0930 18:46:01.239518 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:46:01 crc kubenswrapper[4797]: I0930 18:46:01.428621 4797 generic.go:334] "Generic (PLEG): container finished" podID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerID="5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6" exitCode=0 Sep 30 18:46:01 crc kubenswrapper[4797]: I0930 18:46:01.428667 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" 
event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerDied","Data":"5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6"} Sep 30 18:46:01 crc kubenswrapper[4797]: I0930 18:46:01.428703 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerStarted","Data":"cb81620e5a727807437432f93452ab71046be7d393dfa133081460764708b0b7"} Sep 30 18:46:02 crc kubenswrapper[4797]: I0930 18:46:02.442190 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerStarted","Data":"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32"} Sep 30 18:46:04 crc kubenswrapper[4797]: I0930 18:46:04.466190 4797 generic.go:334] "Generic (PLEG): container finished" podID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerID="7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32" exitCode=0 Sep 30 18:46:04 crc kubenswrapper[4797]: I0930 18:46:04.466316 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerDied","Data":"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32"} Sep 30 18:46:05 crc kubenswrapper[4797]: I0930 18:46:05.484176 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerStarted","Data":"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb"} Sep 30 18:46:05 crc kubenswrapper[4797]: I0930 18:46:05.516941 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mtvv" podStartSLOduration=2.975697052 podStartE2EDuration="6.516916773s" podCreationTimestamp="2025-09-30 18:45:59 
+0000 UTC" firstStartedPulling="2025-09-30 18:46:01.431524634 +0000 UTC m=+3811.954023872" lastFinishedPulling="2025-09-30 18:46:04.972744355 +0000 UTC m=+3815.495243593" observedRunningTime="2025-09-30 18:46:05.509532201 +0000 UTC m=+3816.032031439" watchObservedRunningTime="2025-09-30 18:46:05.516916773 +0000 UTC m=+3816.039416021" Sep 30 18:46:10 crc kubenswrapper[4797]: I0930 18:46:10.090345 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:10 crc kubenswrapper[4797]: I0930 18:46:10.090964 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:10 crc kubenswrapper[4797]: I0930 18:46:10.147979 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:10 crc kubenswrapper[4797]: I0930 18:46:10.578835 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:10 crc kubenswrapper[4797]: I0930 18:46:10.625459 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:46:12 crc kubenswrapper[4797]: I0930 18:46:12.550493 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mtvv" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="registry-server" containerID="cri-o://3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb" gracePeriod=2 Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.151783 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.322289 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities\") pod \"1d3a2d03-221b-457c-8ac1-ae26d231b469\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.322426 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content\") pod \"1d3a2d03-221b-457c-8ac1-ae26d231b469\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.322567 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2rf\" (UniqueName: \"kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf\") pod \"1d3a2d03-221b-457c-8ac1-ae26d231b469\" (UID: \"1d3a2d03-221b-457c-8ac1-ae26d231b469\") " Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.323341 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities" (OuterVolumeSpecName: "utilities") pod "1d3a2d03-221b-457c-8ac1-ae26d231b469" (UID: "1d3a2d03-221b-457c-8ac1-ae26d231b469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.329576 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf" (OuterVolumeSpecName: "kube-api-access-wr2rf") pod "1d3a2d03-221b-457c-8ac1-ae26d231b469" (UID: "1d3a2d03-221b-457c-8ac1-ae26d231b469"). InnerVolumeSpecName "kube-api-access-wr2rf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.374345 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d3a2d03-221b-457c-8ac1-ae26d231b469" (UID: "1d3a2d03-221b-457c-8ac1-ae26d231b469"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.424363 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2rf\" (UniqueName: \"kubernetes.io/projected/1d3a2d03-221b-457c-8ac1-ae26d231b469-kube-api-access-wr2rf\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.424690 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.424750 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3a2d03-221b-457c-8ac1-ae26d231b469-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.562400 4797 generic.go:334] "Generic (PLEG): container finished" podID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerID="3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb" exitCode=0 Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.562465 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerDied","Data":"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb"} Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.562525 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8mtvv" event={"ID":"1d3a2d03-221b-457c-8ac1-ae26d231b469","Type":"ContainerDied","Data":"cb81620e5a727807437432f93452ab71046be7d393dfa133081460764708b0b7"} Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.562530 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mtvv" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.562570 4797 scope.go:117] "RemoveContainer" containerID="3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.595842 4797 scope.go:117] "RemoveContainer" containerID="7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.611536 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.622236 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mtvv"] Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.629564 4797 scope.go:117] "RemoveContainer" containerID="5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.682403 4797 scope.go:117] "RemoveContainer" containerID="3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb" Sep 30 18:46:13 crc kubenswrapper[4797]: E0930 18:46:13.682928 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb\": container with ID starting with 3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb not found: ID does not exist" containerID="3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 
18:46:13.682974 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb"} err="failed to get container status \"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb\": rpc error: code = NotFound desc = could not find container \"3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb\": container with ID starting with 3f2397cb47e4c4728860925308a8aad471269e430348512e5183756a084922bb not found: ID does not exist" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.683008 4797 scope.go:117] "RemoveContainer" containerID="7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32" Sep 30 18:46:13 crc kubenswrapper[4797]: E0930 18:46:13.683306 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32\": container with ID starting with 7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32 not found: ID does not exist" containerID="7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.683342 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32"} err="failed to get container status \"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32\": rpc error: code = NotFound desc = could not find container \"7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32\": container with ID starting with 7d973e7b4e73430563d58bfb8c91e99f46979c0bc9ae412b7fc956a422c08c32 not found: ID does not exist" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.683369 4797 scope.go:117] "RemoveContainer" containerID="5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6" Sep 30 18:46:13 crc 
kubenswrapper[4797]: E0930 18:46:13.683930 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6\": container with ID starting with 5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6 not found: ID does not exist" containerID="5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6" Sep 30 18:46:13 crc kubenswrapper[4797]: I0930 18:46:13.683960 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6"} err="failed to get container status \"5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6\": rpc error: code = NotFound desc = could not find container \"5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6\": container with ID starting with 5ec2f88edbc524aed83083b588d1155a235376676d51b0b2e03ad3827f49d7a6 not found: ID does not exist" Sep 30 18:46:14 crc kubenswrapper[4797]: I0930 18:46:14.253819 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" path="/var/lib/kubelet/pods/1d3a2d03-221b-457c-8ac1-ae26d231b469/volumes" Sep 30 18:46:16 crc kubenswrapper[4797]: I0930 18:46:16.238543 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:46:16 crc kubenswrapper[4797]: E0930 18:46:16.239153 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:46:28 crc 
kubenswrapper[4797]: I0930 18:46:28.238107 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:46:28 crc kubenswrapper[4797]: E0930 18:46:28.239107 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:46:41 crc kubenswrapper[4797]: I0930 18:46:41.238429 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:46:41 crc kubenswrapper[4797]: E0930 18:46:41.239816 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:46:52 crc kubenswrapper[4797]: I0930 18:46:52.238428 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:46:52 crc kubenswrapper[4797]: E0930 18:46:52.240127 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 
30 18:47:06 crc kubenswrapper[4797]: I0930 18:47:06.238402 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:47:06 crc kubenswrapper[4797]: E0930 18:47:06.239050 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.575918 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:16 crc kubenswrapper[4797]: E0930 18:47:16.578109 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="extract-content" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.578322 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="extract-content" Sep 30 18:47:16 crc kubenswrapper[4797]: E0930 18:47:16.578422 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="extract-utilities" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.578531 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="extract-utilities" Sep 30 18:47:16 crc kubenswrapper[4797]: E0930 18:47:16.578621 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="registry-server" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.578708 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" 
containerName="registry-server" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.579104 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3a2d03-221b-457c-8ac1-ae26d231b469" containerName="registry-server" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.581188 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.585747 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.636473 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mfr\" (UniqueName: \"kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.636601 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.636810 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.739640 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.740072 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mfr\" (UniqueName: \"kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.740266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.740508 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.740849 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.758754 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mfr\" (UniqueName: 
\"kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr\") pod \"redhat-operators-gdfrj\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:16 crc kubenswrapper[4797]: I0930 18:47:16.915411 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:17 crc kubenswrapper[4797]: I0930 18:47:17.407820 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:18 crc kubenswrapper[4797]: I0930 18:47:18.238476 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:47:18 crc kubenswrapper[4797]: E0930 18:47:18.239283 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:47:18 crc kubenswrapper[4797]: I0930 18:47:18.366720 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerID="1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46" exitCode=0 Sep 30 18:47:18 crc kubenswrapper[4797]: I0930 18:47:18.366763 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerDied","Data":"1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46"} Sep 30 18:47:18 crc kubenswrapper[4797]: I0930 18:47:18.366788 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" 
event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerStarted","Data":"0f984239187343e15b01a3038f51bc32a7a759a82bfedb53c065ba95579a5856"} Sep 30 18:47:18 crc kubenswrapper[4797]: I0930 18:47:18.370380 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:47:20 crc kubenswrapper[4797]: I0930 18:47:20.390675 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerStarted","Data":"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419"} Sep 30 18:47:23 crc kubenswrapper[4797]: I0930 18:47:23.440678 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerID="9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419" exitCode=0 Sep 30 18:47:23 crc kubenswrapper[4797]: I0930 18:47:23.440798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerDied","Data":"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419"} Sep 30 18:47:24 crc kubenswrapper[4797]: I0930 18:47:24.452307 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerStarted","Data":"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03"} Sep 30 18:47:24 crc kubenswrapper[4797]: I0930 18:47:24.473399 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gdfrj" podStartSLOduration=2.8991751040000002 podStartE2EDuration="8.473375293s" podCreationTimestamp="2025-09-30 18:47:16 +0000 UTC" firstStartedPulling="2025-09-30 18:47:18.370168291 +0000 UTC m=+3888.892667529" lastFinishedPulling="2025-09-30 18:47:23.94436844 +0000 UTC 
m=+3894.466867718" observedRunningTime="2025-09-30 18:47:24.471571364 +0000 UTC m=+3894.994070632" watchObservedRunningTime="2025-09-30 18:47:24.473375293 +0000 UTC m=+3894.995874541" Sep 30 18:47:26 crc kubenswrapper[4797]: I0930 18:47:26.916178 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:26 crc kubenswrapper[4797]: I0930 18:47:26.916596 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:27 crc kubenswrapper[4797]: I0930 18:47:27.967854 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gdfrj" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="registry-server" probeResult="failure" output=< Sep 30 18:47:27 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 18:47:27 crc kubenswrapper[4797]: > Sep 30 18:47:31 crc kubenswrapper[4797]: I0930 18:47:31.238346 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:47:31 crc kubenswrapper[4797]: E0930 18:47:31.239310 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:47:36 crc kubenswrapper[4797]: I0930 18:47:36.998325 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:37 crc kubenswrapper[4797]: I0930 18:47:37.079182 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:37 crc kubenswrapper[4797]: I0930 18:47:37.249084 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:38 crc kubenswrapper[4797]: I0930 18:47:38.587353 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gdfrj" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="registry-server" containerID="cri-o://85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03" gracePeriod=2 Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.183819 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.305057 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities\") pod \"ca0aff57-938b-4f5f-a735-c34789fffc6d\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.305130 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content\") pod \"ca0aff57-938b-4f5f-a735-c34789fffc6d\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.305364 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8mfr\" (UniqueName: \"kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr\") pod \"ca0aff57-938b-4f5f-a735-c34789fffc6d\" (UID: \"ca0aff57-938b-4f5f-a735-c34789fffc6d\") " Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.306096 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities" (OuterVolumeSpecName: "utilities") pod "ca0aff57-938b-4f5f-a735-c34789fffc6d" (UID: "ca0aff57-938b-4f5f-a735-c34789fffc6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.306337 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.311042 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr" (OuterVolumeSpecName: "kube-api-access-k8mfr") pod "ca0aff57-938b-4f5f-a735-c34789fffc6d" (UID: "ca0aff57-938b-4f5f-a735-c34789fffc6d"). InnerVolumeSpecName "kube-api-access-k8mfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.405580 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca0aff57-938b-4f5f-a735-c34789fffc6d" (UID: "ca0aff57-938b-4f5f-a735-c34789fffc6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.408266 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0aff57-938b-4f5f-a735-c34789fffc6d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.408289 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8mfr\" (UniqueName: \"kubernetes.io/projected/ca0aff57-938b-4f5f-a735-c34789fffc6d-kube-api-access-k8mfr\") on node \"crc\" DevicePath \"\"" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.603356 4797 generic.go:334] "Generic (PLEG): container finished" podID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerID="85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03" exitCode=0 Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.603470 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerDied","Data":"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03"} Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.603498 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gdfrj" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.604797 4797 scope.go:117] "RemoveContainer" containerID="85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.604774 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gdfrj" event={"ID":"ca0aff57-938b-4f5f-a735-c34789fffc6d","Type":"ContainerDied","Data":"0f984239187343e15b01a3038f51bc32a7a759a82bfedb53c065ba95579a5856"} Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.645582 4797 scope.go:117] "RemoveContainer" containerID="9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.681582 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.690318 4797 scope.go:117] "RemoveContainer" containerID="1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.698531 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gdfrj"] Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.735321 4797 scope.go:117] "RemoveContainer" containerID="85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03" Sep 30 18:47:39 crc kubenswrapper[4797]: E0930 18:47:39.735809 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03\": container with ID starting with 85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03 not found: ID does not exist" containerID="85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.735876 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03"} err="failed to get container status \"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03\": rpc error: code = NotFound desc = could not find container \"85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03\": container with ID starting with 85afc6f053429aa216938b89be0ad74ce1945ad7671c253c4fdaddb208c49c03 not found: ID does not exist" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.735909 4797 scope.go:117] "RemoveContainer" containerID="9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419" Sep 30 18:47:39 crc kubenswrapper[4797]: E0930 18:47:39.736361 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419\": container with ID starting with 9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419 not found: ID does not exist" containerID="9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.736398 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419"} err="failed to get container status \"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419\": rpc error: code = NotFound desc = could not find container \"9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419\": container with ID starting with 9be7098c50f9412ac082833c9cd92c3152d5d87b77d48b96a209d67074662419 not found: ID does not exist" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.736424 4797 scope.go:117] "RemoveContainer" containerID="1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46" Sep 30 18:47:39 crc kubenswrapper[4797]: E0930 
18:47:39.736717 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46\": container with ID starting with 1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46 not found: ID does not exist" containerID="1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46" Sep 30 18:47:39 crc kubenswrapper[4797]: I0930 18:47:39.736743 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46"} err="failed to get container status \"1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46\": rpc error: code = NotFound desc = could not find container \"1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46\": container with ID starting with 1e8d17b84fe877c2f5bb0892dc442e69f6990e62b855f1977659a6ba1ba1fe46 not found: ID does not exist" Sep 30 18:47:40 crc kubenswrapper[4797]: I0930 18:47:40.256759 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" path="/var/lib/kubelet/pods/ca0aff57-938b-4f5f-a735-c34789fffc6d/volumes" Sep 30 18:47:43 crc kubenswrapper[4797]: I0930 18:47:43.238820 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:47:43 crc kubenswrapper[4797]: E0930 18:47:43.239742 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:47:54 crc kubenswrapper[4797]: I0930 18:47:54.238917 
4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:47:54 crc kubenswrapper[4797]: E0930 18:47:54.240039 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:48:07 crc kubenswrapper[4797]: I0930 18:48:07.238289 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:48:07 crc kubenswrapper[4797]: E0930 18:48:07.238976 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:48:19 crc kubenswrapper[4797]: I0930 18:48:19.238098 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:48:20 crc kubenswrapper[4797]: I0930 18:48:20.087346 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5"} Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.528336 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:30 crc 
kubenswrapper[4797]: E0930 18:50:30.529115 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="registry-server" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.529127 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="registry-server" Sep 30 18:50:30 crc kubenswrapper[4797]: E0930 18:50:30.529149 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="extract-utilities" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.529155 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="extract-utilities" Sep 30 18:50:30 crc kubenswrapper[4797]: E0930 18:50:30.529179 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="extract-content" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.529185 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="extract-content" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.529457 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0aff57-938b-4f5f-a735-c34789fffc6d" containerName="registry-server" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.531117 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.556706 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.663124 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.663657 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghjb8\" (UniqueName: \"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.663702 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.766197 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.766327 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ghjb8\" (UniqueName: \"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.766361 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.766801 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.766922 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.792798 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghjb8\" (UniqueName: \"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8\") pod \"community-operators-kscjl\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:30 crc kubenswrapper[4797]: I0930 18:50:30.854742 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:31 crc kubenswrapper[4797]: I0930 18:50:31.412026 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:31 crc kubenswrapper[4797]: I0930 18:50:31.535718 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerStarted","Data":"f64c15ef80788b0cc4b3434ea27eaa85f27f3737a9b21dc271325fdbd9ad864e"} Sep 30 18:50:32 crc kubenswrapper[4797]: I0930 18:50:32.548948 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerID="baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf" exitCode=0 Sep 30 18:50:32 crc kubenswrapper[4797]: I0930 18:50:32.549087 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerDied","Data":"baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf"} Sep 30 18:50:34 crc kubenswrapper[4797]: I0930 18:50:34.571840 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerStarted","Data":"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900"} Sep 30 18:50:35 crc kubenswrapper[4797]: I0930 18:50:35.589086 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerID="16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900" exitCode=0 Sep 30 18:50:35 crc kubenswrapper[4797]: I0930 18:50:35.589183 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" 
event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerDied","Data":"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900"} Sep 30 18:50:36 crc kubenswrapper[4797]: I0930 18:50:36.601066 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerStarted","Data":"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9"} Sep 30 18:50:36 crc kubenswrapper[4797]: I0930 18:50:36.627990 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kscjl" podStartSLOduration=3.0446364 podStartE2EDuration="6.627970335s" podCreationTimestamp="2025-09-30 18:50:30 +0000 UTC" firstStartedPulling="2025-09-30 18:50:32.551081268 +0000 UTC m=+4083.073580536" lastFinishedPulling="2025-09-30 18:50:36.134415193 +0000 UTC m=+4086.656914471" observedRunningTime="2025-09-30 18:50:36.623712659 +0000 UTC m=+4087.146211927" watchObservedRunningTime="2025-09-30 18:50:36.627970335 +0000 UTC m=+4087.150469583" Sep 30 18:50:40 crc kubenswrapper[4797]: I0930 18:50:40.854901 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:40 crc kubenswrapper[4797]: I0930 18:50:40.855418 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:40 crc kubenswrapper[4797]: I0930 18:50:40.922778 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:41 crc kubenswrapper[4797]: I0930 18:50:41.717810 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:41 crc kubenswrapper[4797]: I0930 18:50:41.769032 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:43 crc kubenswrapper[4797]: I0930 18:50:43.680686 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kscjl" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="registry-server" containerID="cri-o://078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9" gracePeriod=2 Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.191752 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.192136 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.243765 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.368042 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghjb8\" (UniqueName: \"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8\") pod \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.369252 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities" (OuterVolumeSpecName: "utilities") pod "c2866e30-3d53-4a09-a2e6-e9fefa778ad6" (UID: "c2866e30-3d53-4a09-a2e6-e9fefa778ad6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.369899 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities\") pod \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.370179 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content\") pod \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\" (UID: \"c2866e30-3d53-4a09-a2e6-e9fefa778ad6\") " Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.371091 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.382331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8" (OuterVolumeSpecName: "kube-api-access-ghjb8") pod "c2866e30-3d53-4a09-a2e6-e9fefa778ad6" (UID: "c2866e30-3d53-4a09-a2e6-e9fefa778ad6"). InnerVolumeSpecName "kube-api-access-ghjb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.423801 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2866e30-3d53-4a09-a2e6-e9fefa778ad6" (UID: "c2866e30-3d53-4a09-a2e6-e9fefa778ad6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.473002 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.473046 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghjb8\" (UniqueName: \"kubernetes.io/projected/c2866e30-3d53-4a09-a2e6-e9fefa778ad6-kube-api-access-ghjb8\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.692869 4797 generic.go:334] "Generic (PLEG): container finished" podID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerID="078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9" exitCode=0 Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.692906 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerDied","Data":"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9"} Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.692931 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-kscjl" event={"ID":"c2866e30-3d53-4a09-a2e6-e9fefa778ad6","Type":"ContainerDied","Data":"f64c15ef80788b0cc4b3434ea27eaa85f27f3737a9b21dc271325fdbd9ad864e"} Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.692946 4797 scope.go:117] "RemoveContainer" containerID="078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.693059 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kscjl" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.730637 4797 scope.go:117] "RemoveContainer" containerID="16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.733871 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.744092 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kscjl"] Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.756106 4797 scope.go:117] "RemoveContainer" containerID="baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.819365 4797 scope.go:117] "RemoveContainer" containerID="078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9" Sep 30 18:50:44 crc kubenswrapper[4797]: E0930 18:50:44.819744 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9\": container with ID starting with 078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9 not found: ID does not exist" containerID="078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 
18:50:44.819781 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9"} err="failed to get container status \"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9\": rpc error: code = NotFound desc = could not find container \"078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9\": container with ID starting with 078ff20d4943df5f9fb7ea4b3bc72cdccc3ca48012314f01581eb4ce69ae49c9 not found: ID does not exist" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.819825 4797 scope.go:117] "RemoveContainer" containerID="16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900" Sep 30 18:50:44 crc kubenswrapper[4797]: E0930 18:50:44.820037 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900\": container with ID starting with 16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900 not found: ID does not exist" containerID="16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.820068 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900"} err="failed to get container status \"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900\": rpc error: code = NotFound desc = could not find container \"16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900\": container with ID starting with 16c6db670b955aef34d5cd0d5988cfc38e4c7d19a6263adfca2435ad99a44900 not found: ID does not exist" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.820081 4797 scope.go:117] "RemoveContainer" containerID="baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf" Sep 30 18:50:44 crc 
kubenswrapper[4797]: E0930 18:50:44.820249 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf\": container with ID starting with baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf not found: ID does not exist" containerID="baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf" Sep 30 18:50:44 crc kubenswrapper[4797]: I0930 18:50:44.820265 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf"} err="failed to get container status \"baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf\": rpc error: code = NotFound desc = could not find container \"baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf\": container with ID starting with baf92132e6a9318d56a9119272ae461a12356ed7866f3adb3c90af1a685946bf not found: ID does not exist" Sep 30 18:50:46 crc kubenswrapper[4797]: I0930 18:50:46.248169 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" path="/var/lib/kubelet/pods/c2866e30-3d53-4a09-a2e6-e9fefa778ad6/volumes" Sep 30 18:51:14 crc kubenswrapper[4797]: I0930 18:51:14.193598 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:51:14 crc kubenswrapper[4797]: I0930 18:51:14.194309 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.192763 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.193535 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.193611 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.194840 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.194954 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5" gracePeriod=600 Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.351111 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" 
containerID="1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5" exitCode=0 Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.351248 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5"} Sep 30 18:51:44 crc kubenswrapper[4797]: I0930 18:51:44.351594 4797 scope.go:117] "RemoveContainer" containerID="4e74c2e743118595c2fc49270ba7b17c2669141034f8e297406940116a103125" Sep 30 18:51:45 crc kubenswrapper[4797]: I0930 18:51:45.366970 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce"} Sep 30 18:53:44 crc kubenswrapper[4797]: I0930 18:53:44.192500 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:53:44 crc kubenswrapper[4797]: I0930 18:53:44.193162 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:54:14 crc kubenswrapper[4797]: I0930 18:54:14.192242 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 30 18:54:14 crc kubenswrapper[4797]: I0930 18:54:14.192786 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.192920 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.193670 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.193735 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.194799 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.194884 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" gracePeriod=600 Sep 30 18:54:44 crc kubenswrapper[4797]: E0930 18:54:44.332489 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.393780 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" exitCode=0 Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.393821 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce"} Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.393853 4797 scope.go:117] "RemoveContainer" containerID="1d1e3181f3b970f71387b6e810c28b980b70a8b3fd359d8a9ffad37ab9997ad5" Sep 30 18:54:44 crc kubenswrapper[4797]: I0930 18:54:44.394466 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:54:44 crc kubenswrapper[4797]: E0930 18:54:44.394687 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:54:56 crc kubenswrapper[4797]: I0930 18:54:56.239746 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:54:56 crc kubenswrapper[4797]: E0930 18:54:56.241033 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:55:07 crc kubenswrapper[4797]: I0930 18:55:07.237748 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:55:07 crc kubenswrapper[4797]: E0930 18:55:07.238366 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:55:19 crc kubenswrapper[4797]: I0930 18:55:19.238586 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:55:19 crc kubenswrapper[4797]: E0930 18:55:19.239707 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:55:31 crc kubenswrapper[4797]: I0930 18:55:31.239369 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:55:31 crc kubenswrapper[4797]: E0930 18:55:31.240556 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:55:46 crc kubenswrapper[4797]: I0930 18:55:46.239846 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:55:46 crc kubenswrapper[4797]: E0930 18:55:46.240883 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:56:00 crc kubenswrapper[4797]: I0930 18:56:00.255847 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:56:00 crc kubenswrapper[4797]: E0930 18:56:00.256918 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:56:11 crc kubenswrapper[4797]: I0930 18:56:11.238407 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:56:11 crc kubenswrapper[4797]: E0930 18:56:11.239344 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.508495 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:21 crc kubenswrapper[4797]: E0930 18:56:21.511419 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="extract-utilities" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.511471 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="extract-utilities" Sep 30 18:56:21 crc kubenswrapper[4797]: E0930 18:56:21.511511 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="registry-server" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.511524 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="registry-server" Sep 30 18:56:21 crc kubenswrapper[4797]: E0930 18:56:21.511561 4797 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="extract-content" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.511573 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="extract-content" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.512053 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2866e30-3d53-4a09-a2e6-e9fefa778ad6" containerName="registry-server" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.514702 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.524111 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.594742 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.595099 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnfr9\" (UniqueName: \"kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.595514 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities\") pod 
\"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.697568 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnfr9\" (UniqueName: \"kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.697689 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.697784 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.698163 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.698287 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content\") pod \"certified-operators-zv2wf\" (UID: 
\"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.716137 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnfr9\" (UniqueName: \"kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9\") pod \"certified-operators-zv2wf\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:21 crc kubenswrapper[4797]: I0930 18:56:21.850237 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:22 crc kubenswrapper[4797]: I0930 18:56:22.378071 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:22 crc kubenswrapper[4797]: I0930 18:56:22.483498 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerStarted","Data":"9805e1ca360e45ef654701c18dd6805762492776fe06aaf6906d536aa06cefd8"} Sep 30 18:56:23 crc kubenswrapper[4797]: I0930 18:56:23.499815 4797 generic.go:334] "Generic (PLEG): container finished" podID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerID="812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29" exitCode=0 Sep 30 18:56:23 crc kubenswrapper[4797]: I0930 18:56:23.500029 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerDied","Data":"812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29"} Sep 30 18:56:23 crc kubenswrapper[4797]: I0930 18:56:23.506002 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:56:25 crc kubenswrapper[4797]: I0930 18:56:25.239379 4797 
scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:56:25 crc kubenswrapper[4797]: E0930 18:56:25.240484 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:56:25 crc kubenswrapper[4797]: I0930 18:56:25.528197 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerStarted","Data":"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de"} Sep 30 18:56:26 crc kubenswrapper[4797]: I0930 18:56:26.541953 4797 generic.go:334] "Generic (PLEG): container finished" podID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerID="609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de" exitCode=0 Sep 30 18:56:26 crc kubenswrapper[4797]: I0930 18:56:26.542012 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerDied","Data":"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de"} Sep 30 18:56:27 crc kubenswrapper[4797]: I0930 18:56:27.554285 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerStarted","Data":"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1"} Sep 30 18:56:27 crc kubenswrapper[4797]: I0930 18:56:27.577457 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-zv2wf" podStartSLOduration=3.074995898 podStartE2EDuration="6.577423113s" podCreationTimestamp="2025-09-30 18:56:21 +0000 UTC" firstStartedPulling="2025-09-30 18:56:23.505276265 +0000 UTC m=+4434.027775543" lastFinishedPulling="2025-09-30 18:56:27.00770347 +0000 UTC m=+4437.530202758" observedRunningTime="2025-09-30 18:56:27.571025488 +0000 UTC m=+4438.093524746" watchObservedRunningTime="2025-09-30 18:56:27.577423113 +0000 UTC m=+4438.099922351" Sep 30 18:56:31 crc kubenswrapper[4797]: I0930 18:56:31.850933 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:31 crc kubenswrapper[4797]: I0930 18:56:31.851540 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:31 crc kubenswrapper[4797]: I0930 18:56:31.906623 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:32 crc kubenswrapper[4797]: I0930 18:56:32.668416 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:32 crc kubenswrapper[4797]: I0930 18:56:32.727354 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:34 crc kubenswrapper[4797]: I0930 18:56:34.629960 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zv2wf" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="registry-server" containerID="cri-o://2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1" gracePeriod=2 Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.315832 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.412497 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content\") pod \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.413191 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnfr9\" (UniqueName: \"kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9\") pod \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.413361 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities\") pod \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\" (UID: \"e7fd54b7-b798-4c5e-aae6-9501a25165d6\") " Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.414607 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities" (OuterVolumeSpecName: "utilities") pod "e7fd54b7-b798-4c5e-aae6-9501a25165d6" (UID: "e7fd54b7-b798-4c5e-aae6-9501a25165d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.421370 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9" (OuterVolumeSpecName: "kube-api-access-bnfr9") pod "e7fd54b7-b798-4c5e-aae6-9501a25165d6" (UID: "e7fd54b7-b798-4c5e-aae6-9501a25165d6"). InnerVolumeSpecName "kube-api-access-bnfr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.473853 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7fd54b7-b798-4c5e-aae6-9501a25165d6" (UID: "e7fd54b7-b798-4c5e-aae6-9501a25165d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.516195 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnfr9\" (UniqueName: \"kubernetes.io/projected/e7fd54b7-b798-4c5e-aae6-9501a25165d6-kube-api-access-bnfr9\") on node \"crc\" DevicePath \"\"" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.516525 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.516609 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd54b7-b798-4c5e-aae6-9501a25165d6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.640565 4797 generic.go:334] "Generic (PLEG): container finished" podID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerID="2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1" exitCode=0 Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.640617 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerDied","Data":"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1"} Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.640647 4797 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zv2wf" event={"ID":"e7fd54b7-b798-4c5e-aae6-9501a25165d6","Type":"ContainerDied","Data":"9805e1ca360e45ef654701c18dd6805762492776fe06aaf6906d536aa06cefd8"} Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.640669 4797 scope.go:117] "RemoveContainer" containerID="2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.640720 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zv2wf" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.691546 4797 scope.go:117] "RemoveContainer" containerID="609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.697232 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.707144 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zv2wf"] Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.715889 4797 scope.go:117] "RemoveContainer" containerID="812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.772949 4797 scope.go:117] "RemoveContainer" containerID="2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1" Sep 30 18:56:35 crc kubenswrapper[4797]: E0930 18:56:35.773698 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1\": container with ID starting with 2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1 not found: ID does not exist" containerID="2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 
18:56:35.773737 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1"} err="failed to get container status \"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1\": rpc error: code = NotFound desc = could not find container \"2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1\": container with ID starting with 2fdb54c2b9d8ff7e375a48b1484f445b977a874606bf62485ce76980637409a1 not found: ID does not exist" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.773762 4797 scope.go:117] "RemoveContainer" containerID="609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de" Sep 30 18:56:35 crc kubenswrapper[4797]: E0930 18:56:35.774532 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de\": container with ID starting with 609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de not found: ID does not exist" containerID="609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.774569 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de"} err="failed to get container status \"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de\": rpc error: code = NotFound desc = could not find container \"609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de\": container with ID starting with 609a659436142b050801d42085b167ef2ed6ef18160b5c4bfff1f32fe9ad24de not found: ID does not exist" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.774590 4797 scope.go:117] "RemoveContainer" containerID="812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29" Sep 30 18:56:35 crc 
kubenswrapper[4797]: E0930 18:56:35.774863 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29\": container with ID starting with 812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29 not found: ID does not exist" containerID="812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29" Sep 30 18:56:35 crc kubenswrapper[4797]: I0930 18:56:35.774897 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29"} err="failed to get container status \"812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29\": rpc error: code = NotFound desc = could not find container \"812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29\": container with ID starting with 812042a96949dad1b1eb9cdfadb69687906c0bdd28a979986e72fcafc718fe29 not found: ID does not exist" Sep 30 18:56:36 crc kubenswrapper[4797]: I0930 18:56:36.239386 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:56:36 crc kubenswrapper[4797]: E0930 18:56:36.240477 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:56:36 crc kubenswrapper[4797]: I0930 18:56:36.255499 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" path="/var/lib/kubelet/pods/e7fd54b7-b798-4c5e-aae6-9501a25165d6/volumes" Sep 30 18:56:48 crc 
kubenswrapper[4797]: I0930 18:56:48.239000 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:56:48 crc kubenswrapper[4797]: E0930 18:56:48.239859 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:57:01 crc kubenswrapper[4797]: I0930 18:57:01.238347 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:57:01 crc kubenswrapper[4797]: E0930 18:57:01.239424 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:57:15 crc kubenswrapper[4797]: I0930 18:57:15.238370 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:57:15 crc kubenswrapper[4797]: E0930 18:57:15.239615 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 
30 18:57:27 crc kubenswrapper[4797]: I0930 18:57:27.238370 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:57:27 crc kubenswrapper[4797]: E0930 18:57:27.239627 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:57:39 crc kubenswrapper[4797]: I0930 18:57:39.239376 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:57:39 crc kubenswrapper[4797]: E0930 18:57:39.240093 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:57:54 crc kubenswrapper[4797]: I0930 18:57:54.238254 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:57:54 crc kubenswrapper[4797]: E0930 18:57:54.240006 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:58:05 crc kubenswrapper[4797]: I0930 18:58:05.238260 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:58:05 crc kubenswrapper[4797]: E0930 18:58:05.239035 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.557896 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:06 crc kubenswrapper[4797]: E0930 18:58:06.558621 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="extract-content" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.558638 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="extract-content" Sep 30 18:58:06 crc kubenswrapper[4797]: E0930 18:58:06.558667 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="extract-utilities" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.558675 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="extract-utilities" Sep 30 18:58:06 crc kubenswrapper[4797]: E0930 18:58:06.558710 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="registry-server" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.558718 4797 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="registry-server" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.558973 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fd54b7-b798-4c5e-aae6-9501a25165d6" containerName="registry-server" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.560794 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.572418 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.578960 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.579263 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.579407 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rds\" (UniqueName: \"kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.681705 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.681833 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.681862 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rds\" (UniqueName: \"kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.682397 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.682521 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.781426 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rds\" 
(UniqueName: \"kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds\") pod \"redhat-operators-5tlhp\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:06 crc kubenswrapper[4797]: I0930 18:58:06.922363 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:07 crc kubenswrapper[4797]: I0930 18:58:07.436409 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:07 crc kubenswrapper[4797]: I0930 18:58:07.685939 4797 generic.go:334] "Generic (PLEG): container finished" podID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerID="eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05" exitCode=0 Sep 30 18:58:07 crc kubenswrapper[4797]: I0930 18:58:07.686141 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerDied","Data":"eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05"} Sep 30 18:58:07 crc kubenswrapper[4797]: I0930 18:58:07.686207 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerStarted","Data":"f66f484da42ebda9477145df1a398544d2854b99ee3e87f712e5cd89b12f28c7"} Sep 30 18:58:09 crc kubenswrapper[4797]: I0930 18:58:09.705888 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerStarted","Data":"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28"} Sep 30 18:58:14 crc kubenswrapper[4797]: I0930 18:58:14.768172 4797 generic.go:334] "Generic (PLEG): container finished" podID="3df23d0e-71c6-463a-9121-35fd6be5989d" 
containerID="857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28" exitCode=0 Sep 30 18:58:14 crc kubenswrapper[4797]: I0930 18:58:14.768270 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerDied","Data":"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28"} Sep 30 18:58:15 crc kubenswrapper[4797]: I0930 18:58:15.781950 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerStarted","Data":"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f"} Sep 30 18:58:15 crc kubenswrapper[4797]: I0930 18:58:15.805741 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tlhp" podStartSLOduration=2.236207227 podStartE2EDuration="9.805723063s" podCreationTimestamp="2025-09-30 18:58:06 +0000 UTC" firstStartedPulling="2025-09-30 18:58:07.68751044 +0000 UTC m=+4538.210009678" lastFinishedPulling="2025-09-30 18:58:15.257026246 +0000 UTC m=+4545.779525514" observedRunningTime="2025-09-30 18:58:15.798922897 +0000 UTC m=+4546.321422135" watchObservedRunningTime="2025-09-30 18:58:15.805723063 +0000 UTC m=+4546.328222291" Sep 30 18:58:16 crc kubenswrapper[4797]: I0930 18:58:16.923672 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:16 crc kubenswrapper[4797]: I0930 18:58:16.924052 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:17 crc kubenswrapper[4797]: I0930 18:58:17.977573 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tlhp" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="registry-server" 
probeResult="failure" output=< Sep 30 18:58:17 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 18:58:17 crc kubenswrapper[4797]: > Sep 30 18:58:18 crc kubenswrapper[4797]: I0930 18:58:18.238312 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:58:18 crc kubenswrapper[4797]: E0930 18:58:18.238585 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:58:27 crc kubenswrapper[4797]: I0930 18:58:26.999975 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:27 crc kubenswrapper[4797]: I0930 18:58:27.067250 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:27 crc kubenswrapper[4797]: I0930 18:58:27.245738 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:28 crc kubenswrapper[4797]: I0930 18:58:28.941113 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tlhp" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="registry-server" containerID="cri-o://a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f" gracePeriod=2 Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.239118 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:58:29 crc kubenswrapper[4797]: E0930 18:58:29.239403 
4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.825537 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.868148 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content\") pod \"3df23d0e-71c6-463a-9121-35fd6be5989d\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.868249 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities\") pod \"3df23d0e-71c6-463a-9121-35fd6be5989d\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.868410 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2rds\" (UniqueName: \"kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds\") pod \"3df23d0e-71c6-463a-9121-35fd6be5989d\" (UID: \"3df23d0e-71c6-463a-9121-35fd6be5989d\") " Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.874455 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities" (OuterVolumeSpecName: "utilities") pod "3df23d0e-71c6-463a-9121-35fd6be5989d" (UID: 
"3df23d0e-71c6-463a-9121-35fd6be5989d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.875012 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds" (OuterVolumeSpecName: "kube-api-access-j2rds") pod "3df23d0e-71c6-463a-9121-35fd6be5989d" (UID: "3df23d0e-71c6-463a-9121-35fd6be5989d"). InnerVolumeSpecName "kube-api-access-j2rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.955794 4797 generic.go:334] "Generic (PLEG): container finished" podID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerID="a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f" exitCode=0 Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.955838 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerDied","Data":"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f"} Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.955866 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tlhp" event={"ID":"3df23d0e-71c6-463a-9121-35fd6be5989d","Type":"ContainerDied","Data":"f66f484da42ebda9477145df1a398544d2854b99ee3e87f712e5cd89b12f28c7"} Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.955882 4797 scope.go:117] "RemoveContainer" containerID="a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.955880 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5tlhp" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.965316 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3df23d0e-71c6-463a-9121-35fd6be5989d" (UID: "3df23d0e-71c6-463a-9121-35fd6be5989d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.970886 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2rds\" (UniqueName: \"kubernetes.io/projected/3df23d0e-71c6-463a-9121-35fd6be5989d-kube-api-access-j2rds\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.970933 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.970945 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df23d0e-71c6-463a-9121-35fd6be5989d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:29 crc kubenswrapper[4797]: I0930 18:58:29.979604 4797 scope.go:117] "RemoveContainer" containerID="857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.000739 4797 scope.go:117] "RemoveContainer" containerID="eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.039590 4797 scope.go:117] "RemoveContainer" containerID="a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f" Sep 30 18:58:30 crc kubenswrapper[4797]: E0930 18:58:30.040335 4797 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f\": container with ID starting with a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f not found: ID does not exist" containerID="a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.040447 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f"} err="failed to get container status \"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f\": rpc error: code = NotFound desc = could not find container \"a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f\": container with ID starting with a235eb01766318d53ed69ccf4fb75da8068ef716a3147fa13716380e18be8c5f not found: ID does not exist" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.040540 4797 scope.go:117] "RemoveContainer" containerID="857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28" Sep 30 18:58:30 crc kubenswrapper[4797]: E0930 18:58:30.041097 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28\": container with ID starting with 857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28 not found: ID does not exist" containerID="857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.041139 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28"} err="failed to get container status \"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28\": rpc error: code = NotFound desc = could not find container 
\"857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28\": container with ID starting with 857ce67d021967ae9218c43d3a9305107b4e4e67e96acd9034fb526387646c28 not found: ID does not exist" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.041166 4797 scope.go:117] "RemoveContainer" containerID="eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05" Sep 30 18:58:30 crc kubenswrapper[4797]: E0930 18:58:30.043529 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05\": container with ID starting with eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05 not found: ID does not exist" containerID="eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.043561 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05"} err="failed to get container status \"eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05\": rpc error: code = NotFound desc = could not find container \"eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05\": container with ID starting with eac2caede9f41cef1cbbcb212239138770805ebeb703d04413c7f63ab925cb05 not found: ID does not exist" Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.292567 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:30 crc kubenswrapper[4797]: I0930 18:58:30.303271 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tlhp"] Sep 30 18:58:32 crc kubenswrapper[4797]: I0930 18:58:32.252030 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" 
path="/var/lib/kubelet/pods/3df23d0e-71c6-463a-9121-35fd6be5989d/volumes" Sep 30 18:58:41 crc kubenswrapper[4797]: I0930 18:58:41.238912 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:58:41 crc kubenswrapper[4797]: E0930 18:58:41.240673 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:58:53 crc kubenswrapper[4797]: I0930 18:58:53.238846 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:58:53 crc kubenswrapper[4797]: E0930 18:58:53.239473 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:59:06 crc kubenswrapper[4797]: I0930 18:59:06.238970 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:59:06 crc kubenswrapper[4797]: E0930 18:59:06.239773 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:59:17 crc kubenswrapper[4797]: I0930 18:59:17.238060 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:59:17 crc kubenswrapper[4797]: E0930 18:59:17.239090 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:59:22 crc kubenswrapper[4797]: I0930 18:59:22.780137 4797 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-x2tc2" podUID="ccc6478c-07c2-431f-a964-1db62dd3800e" containerName="nmstate-handler" probeResult="failure" output="command timed out" Sep 30 18:59:29 crc kubenswrapper[4797]: I0930 18:59:29.146010 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-594ff6944c-p2jp5" podUID="89ca411e-ead4-4a2d-9eba-f3f8fffcad46" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 18:59:30 crc kubenswrapper[4797]: I0930 18:59:30.257593 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:59:30 crc kubenswrapper[4797]: E0930 18:59:30.258216 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 18:59:44 crc kubenswrapper[4797]: I0930 18:59:44.238380 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 18:59:44 crc kubenswrapper[4797]: I0930 18:59:44.875358 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9"} Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.175382 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt"] Sep 30 19:00:00 crc kubenswrapper[4797]: E0930 19:00:00.176751 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="extract-utilities" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.176772 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="extract-utilities" Sep 30 19:00:00 crc kubenswrapper[4797]: E0930 19:00:00.176799 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="extract-content" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.176812 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="extract-content" Sep 30 19:00:00 crc kubenswrapper[4797]: E0930 19:00:00.176863 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.176875 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" 
containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.177256 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df23d0e-71c6-463a-9121-35fd6be5989d" containerName="registry-server" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.178466 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.181565 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.181680 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.191857 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt"] Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.264487 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7wf\" (UniqueName: \"kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.264566 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.264621 
4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.367149 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7wf\" (UniqueName: \"kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.367360 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.367570 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.369457 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.377454 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.387551 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7wf\" (UniqueName: \"kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf\") pod \"collect-profiles-29320980-jgmmt\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.503039 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:00 crc kubenswrapper[4797]: I0930 19:00:00.991344 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt"] Sep 30 19:00:02 crc kubenswrapper[4797]: I0930 19:00:02.044380 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" event={"ID":"f2606c51-04b1-4ed9-a193-9ff99b1af305","Type":"ContainerStarted","Data":"4eb7a4a11b3a9789f6fa86a613df76f5391be74e4f0dda7d5506d294c29e38cf"} Sep 30 19:00:02 crc kubenswrapper[4797]: I0930 19:00:02.044425 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" event={"ID":"f2606c51-04b1-4ed9-a193-9ff99b1af305","Type":"ContainerStarted","Data":"f8fc8abf0c21865f64308dadbfd3ee8f74b7349dbf9d88e1d83b2f20a88a59cb"} Sep 30 19:00:02 crc kubenswrapper[4797]: I0930 19:00:02.074987 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" podStartSLOduration=2.074964812 podStartE2EDuration="2.074964812s" podCreationTimestamp="2025-09-30 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:00:02.068045443 +0000 UTC m=+4652.590544681" watchObservedRunningTime="2025-09-30 19:00:02.074964812 +0000 UTC m=+4652.597464050" Sep 30 19:00:03 crc kubenswrapper[4797]: I0930 19:00:03.056161 4797 generic.go:334] "Generic (PLEG): container finished" podID="f2606c51-04b1-4ed9-a193-9ff99b1af305" containerID="4eb7a4a11b3a9789f6fa86a613df76f5391be74e4f0dda7d5506d294c29e38cf" exitCode=0 Sep 30 19:00:03 crc kubenswrapper[4797]: I0930 19:00:03.056531 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" event={"ID":"f2606c51-04b1-4ed9-a193-9ff99b1af305","Type":"ContainerDied","Data":"4eb7a4a11b3a9789f6fa86a613df76f5391be74e4f0dda7d5506d294c29e38cf"} Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.579007 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.658237 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume\") pod \"f2606c51-04b1-4ed9-a193-9ff99b1af305\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.658517 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7wf\" (UniqueName: \"kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf\") pod \"f2606c51-04b1-4ed9-a193-9ff99b1af305\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.658691 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume\") pod \"f2606c51-04b1-4ed9-a193-9ff99b1af305\" (UID: \"f2606c51-04b1-4ed9-a193-9ff99b1af305\") " Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.659258 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2606c51-04b1-4ed9-a193-9ff99b1af305" (UID: "f2606c51-04b1-4ed9-a193-9ff99b1af305"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.663637 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2606c51-04b1-4ed9-a193-9ff99b1af305" (UID: "f2606c51-04b1-4ed9-a193-9ff99b1af305"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.664659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf" (OuterVolumeSpecName: "kube-api-access-rb7wf") pod "f2606c51-04b1-4ed9-a193-9ff99b1af305" (UID: "f2606c51-04b1-4ed9-a193-9ff99b1af305"). InnerVolumeSpecName "kube-api-access-rb7wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.760632 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2606c51-04b1-4ed9-a193-9ff99b1af305-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.760663 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7wf\" (UniqueName: \"kubernetes.io/projected/f2606c51-04b1-4ed9-a193-9ff99b1af305-kube-api-access-rb7wf\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:04 crc kubenswrapper[4797]: I0930 19:00:04.760674 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2606c51-04b1-4ed9-a193-9ff99b1af305-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:05 crc kubenswrapper[4797]: I0930 19:00:05.074898 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" 
event={"ID":"f2606c51-04b1-4ed9-a193-9ff99b1af305","Type":"ContainerDied","Data":"f8fc8abf0c21865f64308dadbfd3ee8f74b7349dbf9d88e1d83b2f20a88a59cb"} Sep 30 19:00:05 crc kubenswrapper[4797]: I0930 19:00:05.074933 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fc8abf0c21865f64308dadbfd3ee8f74b7349dbf9d88e1d83b2f20a88a59cb" Sep 30 19:00:05 crc kubenswrapper[4797]: I0930 19:00:05.075025 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-jgmmt" Sep 30 19:00:05 crc kubenswrapper[4797]: I0930 19:00:05.153623 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"] Sep 30 19:00:05 crc kubenswrapper[4797]: I0930 19:00:05.164255 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320935-vgssn"] Sep 30 19:00:06 crc kubenswrapper[4797]: I0930 19:00:06.249608 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7d13b2-d1cb-4b8c-b677-7aaae221e38a" path="/var/lib/kubelet/pods/3f7d13b2-d1cb-4b8c-b677-7aaae221e38a/volumes" Sep 30 19:00:17 crc kubenswrapper[4797]: I0930 19:00:17.868564 4797 scope.go:117] "RemoveContainer" containerID="a644d5555d4d432224f1c0c9fac352b7efd8235bd626931c3959ac1471e2fbb7" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.578945 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:39 crc kubenswrapper[4797]: E0930 19:00:39.582649 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2606c51-04b1-4ed9-a193-9ff99b1af305" containerName="collect-profiles" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.582671 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2606c51-04b1-4ed9-a193-9ff99b1af305" containerName="collect-profiles" Sep 30 19:00:39 crc 
kubenswrapper[4797]: I0930 19:00:39.583031 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2606c51-04b1-4ed9-a193-9ff99b1af305" containerName="collect-profiles" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.585250 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.607522 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.703346 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbctv\" (UniqueName: \"kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.703423 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.703554 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.806689 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.807086 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbctv\" (UniqueName: \"kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.807155 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.807507 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.807810 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.828292 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbctv\" (UniqueName: 
\"kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv\") pod \"community-operators-k4256\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:39 crc kubenswrapper[4797]: I0930 19:00:39.907260 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:40 crc kubenswrapper[4797]: I0930 19:00:40.394611 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:40 crc kubenswrapper[4797]: I0930 19:00:40.470370 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerStarted","Data":"96efb5c7938cc0f342f7e9a7f73f64d846a277fcb18231c7aebff47d39a3184f"} Sep 30 19:00:41 crc kubenswrapper[4797]: I0930 19:00:41.483482 4797 generic.go:334] "Generic (PLEG): container finished" podID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerID="aadddc3ad3dd73747bf4c6f38b2c6986bc26a729eeff329648c27f3b6f7c3797" exitCode=0 Sep 30 19:00:41 crc kubenswrapper[4797]: I0930 19:00:41.483554 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerDied","Data":"aadddc3ad3dd73747bf4c6f38b2c6986bc26a729eeff329648c27f3b6f7c3797"} Sep 30 19:00:42 crc kubenswrapper[4797]: I0930 19:00:42.494933 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerStarted","Data":"ed261f37734fb5e3aafcd5a296c1bd54e3085dc8da2f3713a634f68f1fb20dd0"} Sep 30 19:00:43 crc kubenswrapper[4797]: I0930 19:00:43.511598 4797 generic.go:334] "Generic (PLEG): container finished" podID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" 
containerID="ed261f37734fb5e3aafcd5a296c1bd54e3085dc8da2f3713a634f68f1fb20dd0" exitCode=0 Sep 30 19:00:43 crc kubenswrapper[4797]: I0930 19:00:43.512046 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerDied","Data":"ed261f37734fb5e3aafcd5a296c1bd54e3085dc8da2f3713a634f68f1fb20dd0"} Sep 30 19:00:44 crc kubenswrapper[4797]: I0930 19:00:44.526169 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerStarted","Data":"1cae7802661f7b1ba54707e65b816f9290aa54108cf0c28e76881e91d3e2792f"} Sep 30 19:00:49 crc kubenswrapper[4797]: I0930 19:00:49.907615 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:49 crc kubenswrapper[4797]: I0930 19:00:49.908306 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:49 crc kubenswrapper[4797]: I0930 19:00:49.969722 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:49 crc kubenswrapper[4797]: I0930 19:00:49.998480 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4256" podStartSLOduration=8.518962308999999 podStartE2EDuration="10.998455125s" podCreationTimestamp="2025-09-30 19:00:39 +0000 UTC" firstStartedPulling="2025-09-30 19:00:41.485234036 +0000 UTC m=+4692.007733294" lastFinishedPulling="2025-09-30 19:00:43.964726862 +0000 UTC m=+4694.487226110" observedRunningTime="2025-09-30 19:00:44.547031382 +0000 UTC m=+4695.069530640" watchObservedRunningTime="2025-09-30 19:00:49.998455125 +0000 UTC m=+4700.520954373" Sep 30 19:00:50 crc kubenswrapper[4797]: I0930 
19:00:50.666217 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:50 crc kubenswrapper[4797]: I0930 19:00:50.718230 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:52 crc kubenswrapper[4797]: I0930 19:00:52.615896 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4256" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="registry-server" containerID="cri-o://1cae7802661f7b1ba54707e65b816f9290aa54108cf0c28e76881e91d3e2792f" gracePeriod=2 Sep 30 19:00:54 crc kubenswrapper[4797]: I0930 19:00:54.640062 4797 generic.go:334] "Generic (PLEG): container finished" podID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerID="1cae7802661f7b1ba54707e65b816f9290aa54108cf0c28e76881e91d3e2792f" exitCode=0 Sep 30 19:00:54 crc kubenswrapper[4797]: I0930 19:00:54.640140 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerDied","Data":"1cae7802661f7b1ba54707e65b816f9290aa54108cf0c28e76881e91d3e2792f"} Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.133915 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.256989 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbctv\" (UniqueName: \"kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv\") pod \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.257171 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities\") pod \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.257410 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content\") pod \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\" (UID: \"a339a79c-9a35-4778-b82a-5e9c44d19f9b\") " Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.264530 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv" (OuterVolumeSpecName: "kube-api-access-nbctv") pod "a339a79c-9a35-4778-b82a-5e9c44d19f9b" (UID: "a339a79c-9a35-4778-b82a-5e9c44d19f9b"). InnerVolumeSpecName "kube-api-access-nbctv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.265270 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities" (OuterVolumeSpecName: "utilities") pod "a339a79c-9a35-4778-b82a-5e9c44d19f9b" (UID: "a339a79c-9a35-4778-b82a-5e9c44d19f9b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.327199 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a339a79c-9a35-4778-b82a-5e9c44d19f9b" (UID: "a339a79c-9a35-4778-b82a-5e9c44d19f9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.360349 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.360413 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbctv\" (UniqueName: \"kubernetes.io/projected/a339a79c-9a35-4778-b82a-5e9c44d19f9b-kube-api-access-nbctv\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.360475 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a339a79c-9a35-4778-b82a-5e9c44d19f9b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.443483 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:00:55 crc kubenswrapper[4797]: E0930 19:00:55.444239 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="extract-content" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.444268 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="extract-content" Sep 30 19:00:55 crc kubenswrapper[4797]: E0930 19:00:55.444310 4797 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="extract-utilities" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.444325 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="extract-utilities" Sep 30 19:00:55 crc kubenswrapper[4797]: E0930 19:00:55.444357 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="registry-server" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.444369 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="registry-server" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.444720 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" containerName="registry-server" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.447977 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.464899 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.570547 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.571330 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxggz\" (UniqueName: \"kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz\") pod \"redhat-marketplace-ckz2g\" (UID: 
\"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.571530 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.651589 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4256" event={"ID":"a339a79c-9a35-4778-b82a-5e9c44d19f9b","Type":"ContainerDied","Data":"96efb5c7938cc0f342f7e9a7f73f64d846a277fcb18231c7aebff47d39a3184f"} Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.651641 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4256" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.651646 4797 scope.go:117] "RemoveContainer" containerID="1cae7802661f7b1ba54707e65b816f9290aa54108cf0c28e76881e91d3e2792f" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.672891 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.673002 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxggz\" (UniqueName: \"kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" 
Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.673070 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.673578 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.673791 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.683567 4797 scope.go:117] "RemoveContainer" containerID="ed261f37734fb5e3aafcd5a296c1bd54e3085dc8da2f3713a634f68f1fb20dd0" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.687511 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.695801 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxggz\" (UniqueName: \"kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz\") pod \"redhat-marketplace-ckz2g\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.730621 4797 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-k4256"] Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.733707 4797 scope.go:117] "RemoveContainer" containerID="aadddc3ad3dd73747bf4c6f38b2c6986bc26a729eeff329648c27f3b6f7c3797" Sep 30 19:00:55 crc kubenswrapper[4797]: I0930 19:00:55.797091 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:00:56 crc kubenswrapper[4797]: I0930 19:00:56.251065 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a339a79c-9a35-4778-b82a-5e9c44d19f9b" path="/var/lib/kubelet/pods/a339a79c-9a35-4778-b82a-5e9c44d19f9b/volumes" Sep 30 19:00:56 crc kubenswrapper[4797]: I0930 19:00:56.295281 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:00:56 crc kubenswrapper[4797]: W0930 19:00:56.300853 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd0d210_20ed_405e_a8f7_9e9f6df6e8b5.slice/crio-8011d6b400b03b74ffd873d3b763a860559725414c13180fb641ba1e92b67299 WatchSource:0}: Error finding container 8011d6b400b03b74ffd873d3b763a860559725414c13180fb641ba1e92b67299: Status 404 returned error can't find the container with id 8011d6b400b03b74ffd873d3b763a860559725414c13180fb641ba1e92b67299 Sep 30 19:00:56 crc kubenswrapper[4797]: I0930 19:00:56.662475 4797 generic.go:334] "Generic (PLEG): container finished" podID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerID="a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f" exitCode=0 Sep 30 19:00:56 crc kubenswrapper[4797]: I0930 19:00:56.662529 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerDied","Data":"a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f"} Sep 30 19:00:56 crc 
kubenswrapper[4797]: I0930 19:00:56.662809 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerStarted","Data":"8011d6b400b03b74ffd873d3b763a860559725414c13180fb641ba1e92b67299"} Sep 30 19:00:58 crc kubenswrapper[4797]: I0930 19:00:58.718150 4797 generic.go:334] "Generic (PLEG): container finished" podID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerID="b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c" exitCode=0 Sep 30 19:00:58 crc kubenswrapper[4797]: I0930 19:00:58.718688 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerDied","Data":"b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c"} Sep 30 19:00:59 crc kubenswrapper[4797]: I0930 19:00:59.736976 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerStarted","Data":"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201"} Sep 30 19:00:59 crc kubenswrapper[4797]: I0930 19:00:59.763348 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ckz2g" podStartSLOduration=2.023006929 podStartE2EDuration="4.763320036s" podCreationTimestamp="2025-09-30 19:00:55 +0000 UTC" firstStartedPulling="2025-09-30 19:00:56.664421322 +0000 UTC m=+4707.186920560" lastFinishedPulling="2025-09-30 19:00:59.404734389 +0000 UTC m=+4709.927233667" observedRunningTime="2025-09-30 19:00:59.754156115 +0000 UTC m=+4710.276655443" watchObservedRunningTime="2025-09-30 19:00:59.763320036 +0000 UTC m=+4710.285819314" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.160496 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320981-sk9mv"] 
Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.163583 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.182668 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320981-sk9mv"] Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.184134 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-256r7\" (UniqueName: \"kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.184246 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.184331 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.184369 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 
crc kubenswrapper[4797]: I0930 19:01:00.294780 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.302825 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-256r7\" (UniqueName: \"kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.303053 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.303193 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.313518 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.332101 4797 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-256r7\" (UniqueName: \"kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.343362 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.351563 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data\") pod \"keystone-cron-29320981-sk9mv\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.487178 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:00 crc kubenswrapper[4797]: W0930 19:01:00.981685 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90f2ee8c_9aaa_4cbc_bde8_25bc8a297045.slice/crio-ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9 WatchSource:0}: Error finding container ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9: Status 404 returned error can't find the container with id ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9 Sep 30 19:01:00 crc kubenswrapper[4797]: I0930 19:01:00.986036 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320981-sk9mv"] Sep 30 19:01:01 crc kubenswrapper[4797]: I0930 19:01:01.759103 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-sk9mv" event={"ID":"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045","Type":"ContainerStarted","Data":"d173be0b14596b3d737ddf0600895149b4f1ae6cd06261a59d11d98e11fb836c"} Sep 30 19:01:01 crc kubenswrapper[4797]: I0930 19:01:01.759971 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-sk9mv" event={"ID":"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045","Type":"ContainerStarted","Data":"ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9"} Sep 30 19:01:01 crc kubenswrapper[4797]: I0930 19:01:01.784531 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320981-sk9mv" podStartSLOduration=1.784510568 podStartE2EDuration="1.784510568s" podCreationTimestamp="2025-09-30 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:01:01.771947252 +0000 UTC m=+4712.294446500" watchObservedRunningTime="2025-09-30 19:01:01.784510568 +0000 UTC m=+4712.307009826" Sep 30 19:01:03 crc 
kubenswrapper[4797]: I0930 19:01:03.780691 4797 generic.go:334] "Generic (PLEG): container finished" podID="90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" containerID="d173be0b14596b3d737ddf0600895149b4f1ae6cd06261a59d11d98e11fb836c" exitCode=0 Sep 30 19:01:03 crc kubenswrapper[4797]: I0930 19:01:03.780775 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-sk9mv" event={"ID":"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045","Type":"ContainerDied","Data":"d173be0b14596b3d737ddf0600895149b4f1ae6cd06261a59d11d98e11fb836c"} Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.391887 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.415530 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-256r7\" (UniqueName: \"kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7\") pod \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.415687 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys\") pod \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.415844 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle\") pod \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.415900 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data\") pod \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\" (UID: \"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045\") " Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.422791 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7" (OuterVolumeSpecName: "kube-api-access-256r7") pod "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" (UID: "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045"). InnerVolumeSpecName "kube-api-access-256r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.423623 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" (UID: "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.462875 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" (UID: "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.481347 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data" (OuterVolumeSpecName: "config-data") pod "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" (UID: "90f2ee8c-9aaa-4cbc-bde8-25bc8a297045"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.517650 4797 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.517687 4797 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.517698 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.517707 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-256r7\" (UniqueName: \"kubernetes.io/projected/90f2ee8c-9aaa-4cbc-bde8-25bc8a297045-kube-api-access-256r7\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.797759 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.797837 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.800771 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320981-sk9mv" event={"ID":"90f2ee8c-9aaa-4cbc-bde8-25bc8a297045","Type":"ContainerDied","Data":"ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9"} Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.800810 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320981-sk9mv" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.800834 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2ceca71e0a59359b9e32d8428b39d1baf4e9ffe216290b9ada860588e3e1c9" Sep 30 19:01:05 crc kubenswrapper[4797]: I0930 19:01:05.885534 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:07 crc kubenswrapper[4797]: I0930 19:01:07.736589 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:07 crc kubenswrapper[4797]: I0930 19:01:07.797756 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:01:08 crc kubenswrapper[4797]: I0930 19:01:08.840456 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ckz2g" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="registry-server" containerID="cri-o://736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201" gracePeriod=2 Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.423469 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.611629 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities\") pod \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.611758 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content\") pod \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.611925 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxggz\" (UniqueName: \"kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz\") pod \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\" (UID: \"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5\") " Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.613256 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities" (OuterVolumeSpecName: "utilities") pod "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" (UID: "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.629072 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" (UID: "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.630766 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz" (OuterVolumeSpecName: "kube-api-access-kxggz") pod "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" (UID: "3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5"). InnerVolumeSpecName "kube-api-access-kxggz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.715137 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxggz\" (UniqueName: \"kubernetes.io/projected/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-kube-api-access-kxggz\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.715179 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.715192 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.856221 4797 generic.go:334] "Generic (PLEG): container finished" podID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerID="736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201" exitCode=0 Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.856284 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerDied","Data":"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201"} Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.856345 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ckz2g" event={"ID":"3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5","Type":"ContainerDied","Data":"8011d6b400b03b74ffd873d3b763a860559725414c13180fb641ba1e92b67299"} Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.856348 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ckz2g" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.856379 4797 scope.go:117] "RemoveContainer" containerID="736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.887727 4797 scope.go:117] "RemoveContainer" containerID="b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.909962 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.926227 4797 scope.go:117] "RemoveContainer" containerID="a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.928613 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ckz2g"] Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.973303 4797 scope.go:117] "RemoveContainer" containerID="736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201" Sep 30 19:01:09 crc kubenswrapper[4797]: E0930 19:01:09.974174 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201\": container with ID starting with 736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201 not found: ID does not exist" containerID="736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 
19:01:09.974245 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201"} err="failed to get container status \"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201\": rpc error: code = NotFound desc = could not find container \"736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201\": container with ID starting with 736d4c0b387287047b053906725037bfb3e14b0d43e9015b8e4b3b373d6bf201 not found: ID does not exist" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.974294 4797 scope.go:117] "RemoveContainer" containerID="b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c" Sep 30 19:01:09 crc kubenswrapper[4797]: E0930 19:01:09.974806 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c\": container with ID starting with b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c not found: ID does not exist" containerID="b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.974850 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c"} err="failed to get container status \"b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c\": rpc error: code = NotFound desc = could not find container \"b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c\": container with ID starting with b55665d96e309cef9ae22a74062221fd9bf65c991f668069e6bd09e31177502c not found: ID does not exist" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.974876 4797 scope.go:117] "RemoveContainer" containerID="a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f" Sep 30 19:01:09 crc 
kubenswrapper[4797]: E0930 19:01:09.975081 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f\": container with ID starting with a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f not found: ID does not exist" containerID="a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f" Sep 30 19:01:09 crc kubenswrapper[4797]: I0930 19:01:09.975106 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f"} err="failed to get container status \"a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f\": rpc error: code = NotFound desc = could not find container \"a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f\": container with ID starting with a13936f60e227ad3aa48bf12ac7b2052e12f61773b605c4192e626c2fc50942f not found: ID does not exist" Sep 30 19:01:10 crc kubenswrapper[4797]: I0930 19:01:10.248866 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" path="/var/lib/kubelet/pods/3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5/volumes" Sep 30 19:01:44 crc kubenswrapper[4797]: I0930 19:01:44.191942 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:01:44 crc kubenswrapper[4797]: I0930 19:01:44.192882 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 19:02:14 crc kubenswrapper[4797]: I0930 19:02:14.191639 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:02:14 crc kubenswrapper[4797]: I0930 19:02:14.192294 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.191719 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.192413 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.192562 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.193572 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.193637 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9" gracePeriod=600 Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.834568 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9" exitCode=0 Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.834652 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9"} Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.834839 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"} Sep 30 19:02:44 crc kubenswrapper[4797]: I0930 19:02:44.834861 4797 scope.go:117] "RemoveContainer" containerID="0b5b1bba288a80dacda565b4684b819184ba4835e0395e086f652071a1b12dce" Sep 30 19:04:44 crc kubenswrapper[4797]: I0930 19:04:44.192189 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:04:44 crc kubenswrapper[4797]: I0930 19:04:44.193137 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:05:14 crc kubenswrapper[4797]: I0930 19:05:14.191400 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:05:14 crc kubenswrapper[4797]: I0930 19:05:14.192016 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.191864 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.192457 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.192519 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.193453 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.193536 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" gracePeriod=600 Sep 30 19:05:44 crc kubenswrapper[4797]: E0930 19:05:44.326283 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.957772 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" exitCode=0 Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.957859 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"} Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.958357 4797 scope.go:117] "RemoveContainer" containerID="d5ffa52096ab3a183e0edef077fc22407d98c2ec95e5401e40acd34465e796a9" Sep 30 19:05:44 crc kubenswrapper[4797]: I0930 19:05:44.958978 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:05:44 crc kubenswrapper[4797]: E0930 19:05:44.959386 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:05:56 crc kubenswrapper[4797]: I0930 19:05:56.238615 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:05:56 crc kubenswrapper[4797]: E0930 19:05:56.239385 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:06:08 crc kubenswrapper[4797]: I0930 19:06:08.239736 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:06:08 crc kubenswrapper[4797]: E0930 19:06:08.240568 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:06:22 crc kubenswrapper[4797]: I0930 19:06:22.238727 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:06:22 crc kubenswrapper[4797]: E0930 19:06:22.239662 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:06:33 crc kubenswrapper[4797]: I0930 19:06:33.238872 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:06:33 crc kubenswrapper[4797]: E0930 19:06:33.239648 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:06:45 crc kubenswrapper[4797]: I0930 19:06:45.238814 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:06:45 crc kubenswrapper[4797]: E0930 19:06:45.239896 4797 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:07:00 crc kubenswrapper[4797]: I0930 19:07:00.249712 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:07:00 crc kubenswrapper[4797]: E0930 19:07:00.251131 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:07:13 crc kubenswrapper[4797]: I0930 19:07:13.239711 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:07:13 crc kubenswrapper[4797]: E0930 19:07:13.241123 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:07:26 crc kubenswrapper[4797]: I0930 19:07:26.238386 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:07:26 crc kubenswrapper[4797]: E0930 19:07:26.239666 4797 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:07:39 crc kubenswrapper[4797]: I0930 19:07:39.238266 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:07:39 crc kubenswrapper[4797]: E0930 19:07:39.239084 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:07:54 crc kubenswrapper[4797]: I0930 19:07:54.238576 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:07:54 crc kubenswrapper[4797]: E0930 19:07:54.239505 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:08:07 crc kubenswrapper[4797]: I0930 19:08:07.239104 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:08:07 crc kubenswrapper[4797]: E0930 19:08:07.240240 4797 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:08:21 crc kubenswrapper[4797]: I0930 19:08:21.238560 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:08:21 crc kubenswrapper[4797]: E0930 19:08:21.239323 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:08:34 crc kubenswrapper[4797]: I0930 19:08:34.239591 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:08:34 crc kubenswrapper[4797]: E0930 19:08:34.241111 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:08:46 crc kubenswrapper[4797]: I0930 19:08:46.238837 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:08:46 crc kubenswrapper[4797]: E0930 
19:08:46.240228 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:09:00 crc kubenswrapper[4797]: I0930 19:09:00.243908 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:09:00 crc kubenswrapper[4797]: E0930 19:09:00.244924 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:09:11 crc kubenswrapper[4797]: I0930 19:09:11.238460 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:09:11 crc kubenswrapper[4797]: E0930 19:09:11.239202 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.146384 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"] Sep 30 19:09:13 crc kubenswrapper[4797]: 
E0930 19:09:13.147163 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="extract-utilities" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147352 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="extract-utilities" Sep 30 19:09:13 crc kubenswrapper[4797]: E0930 19:09:13.147410 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="extract-content" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147419 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="extract-content" Sep 30 19:09:13 crc kubenswrapper[4797]: E0930 19:09:13.147463 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="registry-server" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147472 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="registry-server" Sep 30 19:09:13 crc kubenswrapper[4797]: E0930 19:09:13.147486 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" containerName="keystone-cron" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147494 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" containerName="keystone-cron" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147746 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f2ee8c-9aaa-4cbc-bde8-25bc8a297045" containerName="keystone-cron" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.147775 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd0d210-20ed-405e-a8f7-9e9f6df6e8b5" containerName="registry-server" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 
19:09:13.150335 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.167529 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"] Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.259667 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.260007 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.260144 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9lh\" (UniqueName: \"kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.363015 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9lh\" (UniqueName: \"kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 
19:09:13.364145 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.364332 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.364792 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.364799 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.387170 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9lh\" (UniqueName: \"kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh\") pod \"redhat-operators-9cw8r\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") " pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.480323 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cw8r" Sep 30 19:09:13 crc kubenswrapper[4797]: I0930 19:09:13.974804 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"] Sep 30 19:09:15 crc kubenswrapper[4797]: I0930 19:09:15.271295 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerID="794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4" exitCode=0 Sep 30 19:09:15 crc kubenswrapper[4797]: I0930 19:09:15.271532 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerDied","Data":"794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4"} Sep 30 19:09:15 crc kubenswrapper[4797]: I0930 19:09:15.271907 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerStarted","Data":"b41c4c2320ab56e2ea0245488a76327459c969031150a0ea1733171709edfe64"} Sep 30 19:09:15 crc kubenswrapper[4797]: I0930 19:09:15.290201 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:09:17 crc kubenswrapper[4797]: I0930 19:09:17.295117 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerStarted","Data":"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"} Sep 30 19:09:19 crc kubenswrapper[4797]: I0930 19:09:19.318402 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerID="9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd" exitCode=0 Sep 30 19:09:19 crc kubenswrapper[4797]: I0930 19:09:19.318505 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerDied","Data":"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"} Sep 30 19:09:21 crc kubenswrapper[4797]: I0930 19:09:21.351329 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerStarted","Data":"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"} Sep 30 19:09:21 crc kubenswrapper[4797]: I0930 19:09:21.377605 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9cw8r" podStartSLOduration=3.1968236230000002 podStartE2EDuration="8.377584585s" podCreationTimestamp="2025-09-30 19:09:13 +0000 UTC" firstStartedPulling="2025-09-30 19:09:15.278570398 +0000 UTC m=+5205.801069646" lastFinishedPulling="2025-09-30 19:09:20.45933135 +0000 UTC m=+5210.981830608" observedRunningTime="2025-09-30 19:09:21.376480635 +0000 UTC m=+5211.898979873" watchObservedRunningTime="2025-09-30 19:09:21.377584585 +0000 UTC m=+5211.900083843" Sep 30 19:09:23 crc kubenswrapper[4797]: I0930 19:09:23.238801 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72" Sep 30 19:09:23 crc kubenswrapper[4797]: E0930 19:09:23.240048 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:09:23 crc kubenswrapper[4797]: I0930 19:09:23.480461 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:23 crc kubenswrapper[4797]: I0930 19:09:23.480512 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:24 crc kubenswrapper[4797]: I0930 19:09:24.540588 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9cw8r" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="registry-server" probeResult="failure" output=<
Sep 30 19:09:24 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s
Sep 30 19:09:24 crc kubenswrapper[4797]: >
Sep 30 19:09:33 crc kubenswrapper[4797]: I0930 19:09:33.557302 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:33 crc kubenswrapper[4797]: I0930 19:09:33.636587 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:33 crc kubenswrapper[4797]: I0930 19:09:33.801380 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"]
Sep 30 19:09:35 crc kubenswrapper[4797]: I0930 19:09:35.502428 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9cw8r" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="registry-server" containerID="cri-o://4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864" gracePeriod=2
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.091708 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.145768 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content\") pod \"2dd8b975-aaf6-454d-b437-d316a010d68d\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") "
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.146053 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities\") pod \"2dd8b975-aaf6-454d-b437-d316a010d68d\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") "
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.146177 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb9lh\" (UniqueName: \"kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh\") pod \"2dd8b975-aaf6-454d-b437-d316a010d68d\" (UID: \"2dd8b975-aaf6-454d-b437-d316a010d68d\") "
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.148511 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities" (OuterVolumeSpecName: "utilities") pod "2dd8b975-aaf6-454d-b437-d316a010d68d" (UID: "2dd8b975-aaf6-454d-b437-d316a010d68d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.150456 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.153128 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh" (OuterVolumeSpecName: "kube-api-access-nb9lh") pod "2dd8b975-aaf6-454d-b437-d316a010d68d" (UID: "2dd8b975-aaf6-454d-b437-d316a010d68d"). InnerVolumeSpecName "kube-api-access-nb9lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.238888 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:09:36 crc kubenswrapper[4797]: E0930 19:09:36.239326 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.248135 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dd8b975-aaf6-454d-b437-d316a010d68d" (UID: "2dd8b975-aaf6-454d-b437-d316a010d68d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.251942 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb9lh\" (UniqueName: \"kubernetes.io/projected/2dd8b975-aaf6-454d-b437-d316a010d68d-kube-api-access-nb9lh\") on node \"crc\" DevicePath \"\""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.251981 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd8b975-aaf6-454d-b437-d316a010d68d-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.518421 4797 generic.go:334] "Generic (PLEG): container finished" podID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerID="4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864" exitCode=0
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.518551 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cw8r"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.519600 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerDied","Data":"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"}
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.519681 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cw8r" event={"ID":"2dd8b975-aaf6-454d-b437-d316a010d68d","Type":"ContainerDied","Data":"b41c4c2320ab56e2ea0245488a76327459c969031150a0ea1733171709edfe64"}
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.519712 4797 scope.go:117] "RemoveContainer" containerID="4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.545348 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"]
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.547171 4797 scope.go:117] "RemoveContainer" containerID="9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.558403 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9cw8r"]
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.585845 4797 scope.go:117] "RemoveContainer" containerID="794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.647968 4797 scope.go:117] "RemoveContainer" containerID="4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"
Sep 30 19:09:36 crc kubenswrapper[4797]: E0930 19:09:36.649364 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864\": container with ID starting with 4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864 not found: ID does not exist" containerID="4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.649413 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864"} err="failed to get container status \"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864\": rpc error: code = NotFound desc = could not find container \"4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864\": container with ID starting with 4a7d697032010ad05a6fca4b0a9be7883b4d0162c94ad69476a7ade660029864 not found: ID does not exist"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.649469 4797 scope.go:117] "RemoveContainer" containerID="9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"
Sep 30 19:09:36 crc kubenswrapper[4797]: E0930 19:09:36.649872 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd\": container with ID starting with 9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd not found: ID does not exist" containerID="9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.649896 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd"} err="failed to get container status \"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd\": rpc error: code = NotFound desc = could not find container \"9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd\": container with ID starting with 9633b2dd6ab59ef582c8919dde9301ee7c2c0580588df652a65cc8760167d8cd not found: ID does not exist"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.649936 4797 scope.go:117] "RemoveContainer" containerID="794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4"
Sep 30 19:09:36 crc kubenswrapper[4797]: E0930 19:09:36.650204 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4\": container with ID starting with 794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4 not found: ID does not exist" containerID="794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4"
Sep 30 19:09:36 crc kubenswrapper[4797]: I0930 19:09:36.650233 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4"} err="failed to get container status \"794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4\": rpc error: code = NotFound desc = could not find container \"794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4\": container with ID starting with 794ed2e003fd708275be4cc9e2c91e674c4267cca3da64c089865089839800d4 not found: ID does not exist"
Sep 30 19:09:38 crc kubenswrapper[4797]: I0930 19:09:38.252170 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" path="/var/lib/kubelet/pods/2dd8b975-aaf6-454d-b437-d316a010d68d/volumes"
Sep 30 19:09:51 crc kubenswrapper[4797]: I0930 19:09:51.238171 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:09:51 crc kubenswrapper[4797]: E0930 19:09:51.239111 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:10:05 crc kubenswrapper[4797]: I0930 19:10:05.238966 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:10:05 crc kubenswrapper[4797]: E0930 19:10:05.239841 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:10:19 crc kubenswrapper[4797]: I0930 19:10:19.238114 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:10:19 crc kubenswrapper[4797]: E0930 19:10:19.238906 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:10:31 crc kubenswrapper[4797]: I0930 19:10:31.238543 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:10:31 crc kubenswrapper[4797]: E0930 19:10:31.239318 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:10:43 crc kubenswrapper[4797]: I0930 19:10:43.238135 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:10:43 crc kubenswrapper[4797]: E0930 19:10:43.238958 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:10:55 crc kubenswrapper[4797]: I0930 19:10:55.239676 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:10:56 crc kubenswrapper[4797]: I0930 19:10:56.375192 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8"}
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.390046 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:10:58 crc kubenswrapper[4797]: E0930 19:10:58.390698 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="extract-content"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.390711 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="extract-content"
Sep 30 19:10:58 crc kubenswrapper[4797]: E0930 19:10:58.390739 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="extract-utilities"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.390746 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="extract-utilities"
Sep 30 19:10:58 crc kubenswrapper[4797]: E0930 19:10:58.390761 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="registry-server"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.390767 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="registry-server"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.390940 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd8b975-aaf6-454d-b437-d316a010d68d" containerName="registry-server"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.392276 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.410337 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.487888 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5kwl\" (UniqueName: \"kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.487982 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.488151 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.590424 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5kwl\" (UniqueName: \"kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.590502 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.590572 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.591266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.591266 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:58 crc kubenswrapper[4797]: I0930 19:10:58.880298 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5kwl\" (UniqueName: \"kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl\") pod \"community-operators-l2tq7\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") " pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:59 crc kubenswrapper[4797]: I0930 19:10:59.015823 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:10:59 crc kubenswrapper[4797]: I0930 19:10:59.460495 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:11:00 crc kubenswrapper[4797]: I0930 19:11:00.414780 4797 generic.go:334] "Generic (PLEG): container finished" podID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerID="f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5" exitCode=0
Sep 30 19:11:00 crc kubenswrapper[4797]: I0930 19:11:00.414987 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerDied","Data":"f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5"}
Sep 30 19:11:00 crc kubenswrapper[4797]: I0930 19:11:00.415048 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerStarted","Data":"cc87bc66225486715bd19a80c271b0d8aa7e0fa4a9a1f1b89da6f8c48aa96816"}
Sep 30 19:11:01 crc kubenswrapper[4797]: I0930 19:11:01.427929 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerStarted","Data":"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"}
Sep 30 19:11:02 crc kubenswrapper[4797]: I0930 19:11:02.444152 4797 generic.go:334] "Generic (PLEG): container finished" podID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerID="defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4" exitCode=0
Sep 30 19:11:02 crc kubenswrapper[4797]: I0930 19:11:02.444241 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerDied","Data":"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"}
Sep 30 19:11:03 crc kubenswrapper[4797]: I0930 19:11:03.455965 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerStarted","Data":"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"}
Sep 30 19:11:03 crc kubenswrapper[4797]: I0930 19:11:03.474604 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2tq7" podStartSLOduration=2.90949496 podStartE2EDuration="5.474584955s" podCreationTimestamp="2025-09-30 19:10:58 +0000 UTC" firstStartedPulling="2025-09-30 19:11:00.41741558 +0000 UTC m=+5310.939914818" lastFinishedPulling="2025-09-30 19:11:02.982505565 +0000 UTC m=+5313.505004813" observedRunningTime="2025-09-30 19:11:03.473508615 +0000 UTC m=+5313.996007863" watchObservedRunningTime="2025-09-30 19:11:03.474584955 +0000 UTC m=+5313.997084193"
Sep 30 19:11:09 crc kubenswrapper[4797]: I0930 19:11:09.016652 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:09 crc kubenswrapper[4797]: I0930 19:11:09.017276 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:09 crc kubenswrapper[4797]: I0930 19:11:09.087221 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:10 crc kubenswrapper[4797]: I0930 19:11:10.257071 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:10 crc kubenswrapper[4797]: I0930 19:11:10.315540 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:11:11 crc kubenswrapper[4797]: I0930 19:11:11.537606 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2tq7" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="registry-server" containerID="cri-o://ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41" gracePeriod=2
Sep 30 19:11:11 crc kubenswrapper[4797]: I0930 19:11:11.987108 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.168346 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5kwl\" (UniqueName: \"kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl\") pod \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") "
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.168419 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities\") pod \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") "
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.168582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content\") pod \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\" (UID: \"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc\") "
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.170789 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities" (OuterVolumeSpecName: "utilities") pod "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" (UID: "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.175936 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl" (OuterVolumeSpecName: "kube-api-access-q5kwl") pod "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" (UID: "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc"). InnerVolumeSpecName "kube-api-access-q5kwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.229059 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" (UID: "c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.270888 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5kwl\" (UniqueName: \"kubernetes.io/projected/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-kube-api-access-q5kwl\") on node \"crc\" DevicePath \"\""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.270931 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.270949 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.552671 4797 generic.go:334] "Generic (PLEG): container finished" podID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerID="ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41" exitCode=0
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.552735 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerDied","Data":"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"}
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.552772 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2tq7"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.552804 4797 scope.go:117] "RemoveContainer" containerID="ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.552787 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2tq7" event={"ID":"c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc","Type":"ContainerDied","Data":"cc87bc66225486715bd19a80c271b0d8aa7e0fa4a9a1f1b89da6f8c48aa96816"}
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.590632 4797 scope.go:117] "RemoveContainer" containerID="defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.597980 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.608693 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2tq7"]
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.622731 4797 scope.go:117] "RemoveContainer" containerID="f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.670384 4797 scope.go:117] "RemoveContainer" containerID="ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"
Sep 30 19:11:12 crc kubenswrapper[4797]: E0930 19:11:12.670855 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41\": container with ID starting with ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41 not found: ID does not exist" containerID="ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.670925 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41"} err="failed to get container status \"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41\": rpc error: code = NotFound desc = could not find container \"ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41\": container with ID starting with ce9d429746413963186a940aa7a3c6bd0466c84aa47452d958b78d72682cec41 not found: ID does not exist"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.670955 4797 scope.go:117] "RemoveContainer" containerID="defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"
Sep 30 19:11:12 crc kubenswrapper[4797]: E0930 19:11:12.671447 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4\": container with ID starting with defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4 not found: ID does not exist" containerID="defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.671478 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4"} err="failed to get container status \"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4\": rpc error: code = NotFound desc = could not find container \"defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4\": container with ID starting with defda865a47004255d22872c51f9131738ff20d76d65a6a899bae6e1859bcfb4 not found: ID does not exist"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.671497 4797 scope.go:117] "RemoveContainer" containerID="f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5"
Sep 30 19:11:12 crc kubenswrapper[4797]: E0930 19:11:12.671781 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5\": container with ID starting with f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5 not found: ID does not exist" containerID="f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5"
Sep 30 19:11:12 crc kubenswrapper[4797]: I0930 19:11:12.671812 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5"} err="failed to get container status \"f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5\": rpc error: code = NotFound desc = could not find container \"f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5\": container with ID starting with f9e2d077332f48527102e9906c733bd467898f68abf55504d844120e577cdaa5 not found: ID does not exist"
Sep 30 19:11:14 crc kubenswrapper[4797]: I0930 19:11:14.249901 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" path="/var/lib/kubelet/pods/c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc/volumes"
Sep 30 19:13:14 crc kubenswrapper[4797]: I0930 19:13:14.193669 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:13:14 crc kubenswrapper[4797]: I0930 19:13:14.194448 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 19:13:44 crc kubenswrapper[4797]: I0930 19:13:44.192484 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:13:44 crc kubenswrapper[4797]: I0930 19:13:44.193352 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.191998 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.192767 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.192835 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9"
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.194083 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.194229 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8" gracePeriod=600
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.616501 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8" exitCode=0
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.616558 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8"}
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.617059 4797 scope.go:117] "RemoveContainer" containerID="acd2e40d86f93355db936059fcb637d3d2687b9ead7ab6a70b0d7a2d982dce72"
Sep 30 19:14:14 crc kubenswrapper[4797]: I0930 19:14:14.616958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0"}
Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.151525 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g"]
Sep 30 19:15:00 crc kubenswrapper[4797]: E0930 19:15:00.152376 4797 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="extract-utilities" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.152387 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="extract-utilities" Sep 30 19:15:00 crc kubenswrapper[4797]: E0930 19:15:00.152407 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="extract-content" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.152413 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="extract-content" Sep 30 19:15:00 crc kubenswrapper[4797]: E0930 19:15:00.152420 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.152426 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.152651 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60d83b0-8a51-48d3-aeff-bd47b6dbc8dc" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.153340 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.156225 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.156665 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.173976 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g"] Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.198062 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.198150 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pggcp\" (UniqueName: \"kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.198302 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.299679 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.299806 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.299861 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pggcp\" (UniqueName: \"kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.300680 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.306660 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.330616 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pggcp\" (UniqueName: \"kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp\") pod \"collect-profiles-29320995-4ss2g\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.474040 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:00 crc kubenswrapper[4797]: I0930 19:15:00.954535 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g"] Sep 30 19:15:00 crc kubenswrapper[4797]: W0930 19:15:00.955379 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d617fe5_4112_4c52_a229_faee72079443.slice/crio-e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa WatchSource:0}: Error finding container e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa: Status 404 returned error can't find the container with id e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa Sep 30 19:15:01 crc kubenswrapper[4797]: I0930 19:15:01.160779 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" event={"ID":"0d617fe5-4112-4c52-a229-faee72079443","Type":"ContainerStarted","Data":"66f34f68b086e717e6998182d40d0eb16f14c57a147d9499d165ce6e6971af45"} Sep 30 19:15:01 crc 
kubenswrapper[4797]: I0930 19:15:01.161146 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" event={"ID":"0d617fe5-4112-4c52-a229-faee72079443","Type":"ContainerStarted","Data":"e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa"} Sep 30 19:15:01 crc kubenswrapper[4797]: I0930 19:15:01.180559 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" podStartSLOduration=1.180539539 podStartE2EDuration="1.180539539s" podCreationTimestamp="2025-09-30 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:15:01.177343801 +0000 UTC m=+5551.699843039" watchObservedRunningTime="2025-09-30 19:15:01.180539539 +0000 UTC m=+5551.703038777" Sep 30 19:15:02 crc kubenswrapper[4797]: I0930 19:15:02.171681 4797 generic.go:334] "Generic (PLEG): container finished" podID="0d617fe5-4112-4c52-a229-faee72079443" containerID="66f34f68b086e717e6998182d40d0eb16f14c57a147d9499d165ce6e6971af45" exitCode=0 Sep 30 19:15:02 crc kubenswrapper[4797]: I0930 19:15:02.171728 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" event={"ID":"0d617fe5-4112-4c52-a229-faee72079443","Type":"ContainerDied","Data":"66f34f68b086e717e6998182d40d0eb16f14c57a147d9499d165ce6e6971af45"} Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.650273 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.673885 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume\") pod \"0d617fe5-4112-4c52-a229-faee72079443\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.673976 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume\") pod \"0d617fe5-4112-4c52-a229-faee72079443\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.674100 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pggcp\" (UniqueName: \"kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp\") pod \"0d617fe5-4112-4c52-a229-faee72079443\" (UID: \"0d617fe5-4112-4c52-a229-faee72079443\") " Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.674847 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d617fe5-4112-4c52-a229-faee72079443" (UID: "0d617fe5-4112-4c52-a229-faee72079443"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.681799 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d617fe5-4112-4c52-a229-faee72079443" (UID: "0d617fe5-4112-4c52-a229-faee72079443"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.682608 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp" (OuterVolumeSpecName: "kube-api-access-pggcp") pod "0d617fe5-4112-4c52-a229-faee72079443" (UID: "0d617fe5-4112-4c52-a229-faee72079443"). InnerVolumeSpecName "kube-api-access-pggcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.776455 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d617fe5-4112-4c52-a229-faee72079443-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.776494 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d617fe5-4112-4c52-a229-faee72079443-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:03 crc kubenswrapper[4797]: I0930 19:15:03.776505 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pggcp\" (UniqueName: \"kubernetes.io/projected/0d617fe5-4112-4c52-a229-faee72079443-kube-api-access-pggcp\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:04 crc kubenswrapper[4797]: I0930 19:15:04.199874 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" event={"ID":"0d617fe5-4112-4c52-a229-faee72079443","Type":"ContainerDied","Data":"e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa"} Sep 30 19:15:04 crc kubenswrapper[4797]: I0930 19:15:04.199919 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b65b471fe4be75665d49674763902c61b2d2c9292d61928943ae7fc76e56aa" Sep 30 19:15:04 crc kubenswrapper[4797]: I0930 19:15:04.199983 4797 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-4ss2g" Sep 30 19:15:04 crc kubenswrapper[4797]: I0930 19:15:04.270964 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8"] Sep 30 19:15:04 crc kubenswrapper[4797]: I0930 19:15:04.282246 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320950-g7wk8"] Sep 30 19:15:06 crc kubenswrapper[4797]: I0930 19:15:06.252136 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c273c7c-5815-4b45-b341-67a5cb16a202" path="/var/lib/kubelet/pods/1c273c7c-5815-4b45-b341-67a5cb16a202/volumes" Sep 30 19:15:18 crc kubenswrapper[4797]: I0930 19:15:18.322863 4797 scope.go:117] "RemoveContainer" containerID="4bc69441cdfad1fb9274e4975522fee6a1e17eacecc77a26209380554c43b814" Sep 30 19:15:44 crc kubenswrapper[4797]: I0930 19:15:44.954773 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:44 crc kubenswrapper[4797]: E0930 19:15:44.956907 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d617fe5-4112-4c52-a229-faee72079443" containerName="collect-profiles" Sep 30 19:15:44 crc kubenswrapper[4797]: I0930 19:15:44.956958 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d617fe5-4112-4c52-a229-faee72079443" containerName="collect-profiles" Sep 30 19:15:44 crc kubenswrapper[4797]: I0930 19:15:44.957501 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d617fe5-4112-4c52-a229-faee72079443" containerName="collect-profiles" Sep 30 19:15:44 crc kubenswrapper[4797]: I0930 19:15:44.960087 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:44 crc kubenswrapper[4797]: I0930 19:15:44.973031 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.088297 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.088412 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.088494 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gx74\" (UniqueName: \"kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.189727 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.189814 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8gx74\" (UniqueName: \"kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.189924 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.190332 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.190479 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.217570 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gx74\" (UniqueName: \"kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74\") pod \"certified-operators-g82wb\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.288303 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:45 crc kubenswrapper[4797]: I0930 19:15:45.806140 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:46 crc kubenswrapper[4797]: I0930 19:15:46.666674 4797 generic.go:334] "Generic (PLEG): container finished" podID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerID="990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43" exitCode=0 Sep 30 19:15:46 crc kubenswrapper[4797]: I0930 19:15:46.666763 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerDied","Data":"990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43"} Sep 30 19:15:46 crc kubenswrapper[4797]: I0930 19:15:46.667000 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerStarted","Data":"3259f12835703695ded593ab561c45e5cb506146ebcd83a5a8ee6b160fe5add0"} Sep 30 19:15:46 crc kubenswrapper[4797]: I0930 19:15:46.670737 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:15:48 crc kubenswrapper[4797]: I0930 19:15:48.695908 4797 generic.go:334] "Generic (PLEG): container finished" podID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerID="0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00" exitCode=0 Sep 30 19:15:48 crc kubenswrapper[4797]: I0930 19:15:48.696407 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerDied","Data":"0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00"} Sep 30 19:15:49 crc kubenswrapper[4797]: I0930 19:15:49.755357 4797 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-g82wb" podStartSLOduration=3.174905607 podStartE2EDuration="5.755333113s" podCreationTimestamp="2025-09-30 19:15:44 +0000 UTC" firstStartedPulling="2025-09-30 19:15:46.670367995 +0000 UTC m=+5597.192867273" lastFinishedPulling="2025-09-30 19:15:49.250795541 +0000 UTC m=+5599.773294779" observedRunningTime="2025-09-30 19:15:49.751340723 +0000 UTC m=+5600.273839961" watchObservedRunningTime="2025-09-30 19:15:49.755333113 +0000 UTC m=+5600.277832351" Sep 30 19:15:50 crc kubenswrapper[4797]: I0930 19:15:50.740288 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerStarted","Data":"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d"} Sep 30 19:15:55 crc kubenswrapper[4797]: I0930 19:15:55.289029 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:55 crc kubenswrapper[4797]: I0930 19:15:55.289617 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:55 crc kubenswrapper[4797]: I0930 19:15:55.362737 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:55 crc kubenswrapper[4797]: I0930 19:15:55.851749 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:55 crc kubenswrapper[4797]: I0930 19:15:55.901457 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:57 crc kubenswrapper[4797]: I0930 19:15:57.817860 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g82wb" 
podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="registry-server" containerID="cri-o://fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d" gracePeriod=2 Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.377833 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.482397 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content\") pod \"caf83e07-a6e1-4316-8e92-4b6873470da2\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.482827 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities\") pod \"caf83e07-a6e1-4316-8e92-4b6873470da2\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.482893 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gx74\" (UniqueName: \"kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74\") pod \"caf83e07-a6e1-4316-8e92-4b6873470da2\" (UID: \"caf83e07-a6e1-4316-8e92-4b6873470da2\") " Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.484222 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities" (OuterVolumeSpecName: "utilities") pod "caf83e07-a6e1-4316-8e92-4b6873470da2" (UID: "caf83e07-a6e1-4316-8e92-4b6873470da2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.490896 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74" (OuterVolumeSpecName: "kube-api-access-8gx74") pod "caf83e07-a6e1-4316-8e92-4b6873470da2" (UID: "caf83e07-a6e1-4316-8e92-4b6873470da2"). InnerVolumeSpecName "kube-api-access-8gx74". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.556845 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caf83e07-a6e1-4316-8e92-4b6873470da2" (UID: "caf83e07-a6e1-4316-8e92-4b6873470da2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.585265 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.585341 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gx74\" (UniqueName: \"kubernetes.io/projected/caf83e07-a6e1-4316-8e92-4b6873470da2-kube-api-access-8gx74\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.585372 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf83e07-a6e1-4316-8e92-4b6873470da2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.832830 4797 generic.go:334] "Generic (PLEG): container finished" podID="caf83e07-a6e1-4316-8e92-4b6873470da2" 
containerID="fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d" exitCode=0 Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.832895 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerDied","Data":"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d"} Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.832937 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g82wb" event={"ID":"caf83e07-a6e1-4316-8e92-4b6873470da2","Type":"ContainerDied","Data":"3259f12835703695ded593ab561c45e5cb506146ebcd83a5a8ee6b160fe5add0"} Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.832966 4797 scope.go:117] "RemoveContainer" containerID="fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.833160 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g82wb" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.869909 4797 scope.go:117] "RemoveContainer" containerID="0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.880455 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.891033 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g82wb"] Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.912547 4797 scope.go:117] "RemoveContainer" containerID="990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.953413 4797 scope.go:117] "RemoveContainer" containerID="fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d" Sep 30 19:15:58 crc kubenswrapper[4797]: E0930 19:15:58.960644 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d\": container with ID starting with fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d not found: ID does not exist" containerID="fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.960707 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d"} err="failed to get container status \"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d\": rpc error: code = NotFound desc = could not find container \"fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d\": container with ID starting with fff5f4ab890e7481721cfa7223d9ea131f9ecfea2886988c00c6c9f27bd1893d not 
found: ID does not exist" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.960743 4797 scope.go:117] "RemoveContainer" containerID="0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00" Sep 30 19:15:58 crc kubenswrapper[4797]: E0930 19:15:58.961304 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00\": container with ID starting with 0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00 not found: ID does not exist" containerID="0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.961349 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00"} err="failed to get container status \"0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00\": rpc error: code = NotFound desc = could not find container \"0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00\": container with ID starting with 0d5398d9e7c16eb635efd4a5619b7f03fa4668c955e392109dc0c62dbe1f2b00 not found: ID does not exist" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.961388 4797 scope.go:117] "RemoveContainer" containerID="990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43" Sep 30 19:15:58 crc kubenswrapper[4797]: E0930 19:15:58.961847 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43\": container with ID starting with 990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43 not found: ID does not exist" containerID="990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43" Sep 30 19:15:58 crc kubenswrapper[4797]: I0930 19:15:58.961884 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43"} err="failed to get container status \"990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43\": rpc error: code = NotFound desc = could not find container \"990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43\": container with ID starting with 990d4fdeb0528df482634f339732232df6ce1e1cfefc9b5a904b401c236e4b43 not found: ID does not exist" Sep 30 19:16:00 crc kubenswrapper[4797]: I0930 19:16:00.256546 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" path="/var/lib/kubelet/pods/caf83e07-a6e1-4316-8e92-4b6873470da2/volumes" Sep 30 19:16:08 crc kubenswrapper[4797]: I0930 19:16:08.939980 4797 generic.go:334] "Generic (PLEG): container finished" podID="05275916-b3be-4d53-8a06-ab3d5c8b3f7b" containerID="24486d04c2bca481ba2dd1fa05c0bb4a4a9591005d7f4bb913ff987d688382e9" exitCode=0 Sep 30 19:16:08 crc kubenswrapper[4797]: I0930 19:16:08.940107 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"05275916-b3be-4d53-8a06-ab3d5c8b3f7b","Type":"ContainerDied","Data":"24486d04c2bca481ba2dd1fa05c0bb4a4a9591005d7f4bb913ff987d688382e9"} Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.341350 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522233 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522376 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhr2r\" (UniqueName: \"kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522406 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522464 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522517 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.522540 4797 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.523049 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.523334 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.523415 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.523478 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\" (UID: \"05275916-b3be-4d53-8a06-ab3d5c8b3f7b\") " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.524296 4797 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.524659 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data" (OuterVolumeSpecName: "config-data") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.528547 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.536946 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r" (OuterVolumeSpecName: "kube-api-access-hhr2r") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "kube-api-access-hhr2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.552531 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.557692 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.558243 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.586331 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626404 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626454 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhr2r\" (UniqueName: \"kubernetes.io/projected/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-kube-api-access-hhr2r\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626470 4797 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626482 4797 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626493 4797 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626506 4797 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.626541 4797 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.655823 4797 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.728419 4797 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.917758 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "05275916-b3be-4d53-8a06-ab3d5c8b3f7b" (UID: "05275916-b3be-4d53-8a06-ab3d5c8b3f7b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.932198 4797 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/05275916-b3be-4d53-8a06-ab3d5c8b3f7b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.966916 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"05275916-b3be-4d53-8a06-ab3d5c8b3f7b","Type":"ContainerDied","Data":"5ad9aa9d3899cb4746565fb20d6078fbd3dabcfd76315e26e69794abcfe249fd"} Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.966969 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad9aa9d3899cb4746565fb20d6078fbd3dabcfd76315e26e69794abcfe249fd" Sep 30 19:16:10 crc kubenswrapper[4797]: I0930 19:16:10.967012 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.198564 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.199115 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.911382 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 19:16:14 crc kubenswrapper[4797]: E0930 19:16:14.912251 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="extract-content" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912272 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="extract-content" Sep 30 19:16:14 crc kubenswrapper[4797]: E0930 19:16:14.912313 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="extract-utilities" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912324 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="extract-utilities" Sep 30 19:16:14 crc kubenswrapper[4797]: E0930 19:16:14.912347 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05275916-b3be-4d53-8a06-ab3d5c8b3f7b" 
containerName="tempest-tests-tempest-tests-runner" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912359 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="05275916-b3be-4d53-8a06-ab3d5c8b3f7b" containerName="tempest-tests-tempest-tests-runner" Sep 30 19:16:14 crc kubenswrapper[4797]: E0930 19:16:14.912390 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="registry-server" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912402 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="registry-server" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912767 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="05275916-b3be-4d53-8a06-ab3d5c8b3f7b" containerName="tempest-tests-tempest-tests-runner" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.912822 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf83e07-a6e1-4316-8e92-4b6873470da2" containerName="registry-server" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.913913 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.916161 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mcnmz" Sep 30 19:16:14 crc kubenswrapper[4797]: I0930 19:16:14.925365 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.014569 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9l95\" (UniqueName: \"kubernetes.io/projected/2754f75a-653d-4ec9-8bac-e81f0353ec88-kube-api-access-k9l95\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.014835 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.116853 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.117348 4797 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.117536 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9l95\" (UniqueName: \"kubernetes.io/projected/2754f75a-653d-4ec9-8bac-e81f0353ec88-kube-api-access-k9l95\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.148881 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9l95\" (UniqueName: \"kubernetes.io/projected/2754f75a-653d-4ec9-8bac-e81f0353ec88-kube-api-access-k9l95\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.152729 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2754f75a-653d-4ec9-8bac-e81f0353ec88\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.250050 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 19:16:15 crc kubenswrapper[4797]: I0930 19:16:15.745003 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 19:16:16 crc kubenswrapper[4797]: I0930 19:16:16.021578 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2754f75a-653d-4ec9-8bac-e81f0353ec88","Type":"ContainerStarted","Data":"a0fac469e59b762647726baee859a008f2fd3ce5f883bc2cf33bf25c974e5770"} Sep 30 19:16:17 crc kubenswrapper[4797]: I0930 19:16:17.032199 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2754f75a-653d-4ec9-8bac-e81f0353ec88","Type":"ContainerStarted","Data":"4630cf0fa00663d50c8e8eaeca28e9b5875d45ebd24e6fa71788159b478a8296"} Sep 30 19:16:17 crc kubenswrapper[4797]: I0930 19:16:17.058345 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.208721411 podStartE2EDuration="3.058321183s" podCreationTimestamp="2025-09-30 19:16:14 +0000 UTC" firstStartedPulling="2025-09-30 19:16:15.747964711 +0000 UTC m=+5626.270463949" lastFinishedPulling="2025-09-30 19:16:16.597564463 +0000 UTC m=+5627.120063721" observedRunningTime="2025-09-30 19:16:17.051972158 +0000 UTC m=+5627.574471416" watchObservedRunningTime="2025-09-30 19:16:17.058321183 +0000 UTC m=+5627.580820441" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.614898 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7qqgx/must-gather-hzj2c"] Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.617005 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.619586 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7qqgx"/"openshift-service-ca.crt" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.619731 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7qqgx"/"default-dockercfg-b22dj" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.620420 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7qqgx"/"kube-root-ca.crt" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.628720 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7qqgx/must-gather-hzj2c"] Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.739103 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tw2\" (UniqueName: \"kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.739307 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.841185 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tw2\" (UniqueName: \"kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " 
pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.841356 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.841832 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.869008 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tw2\" (UniqueName: \"kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2\") pod \"must-gather-hzj2c\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:34 crc kubenswrapper[4797]: I0930 19:16:34.933188 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:16:35 crc kubenswrapper[4797]: W0930 19:16:35.395457 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33788a93_3dcc_4003_be30_4fe40760f228.slice/crio-2ad0a8ef3b147c43d207e29372f82a6761db6627160a5c9b30da1706c776eae9 WatchSource:0}: Error finding container 2ad0a8ef3b147c43d207e29372f82a6761db6627160a5c9b30da1706c776eae9: Status 404 returned error can't find the container with id 2ad0a8ef3b147c43d207e29372f82a6761db6627160a5c9b30da1706c776eae9 Sep 30 19:16:35 crc kubenswrapper[4797]: I0930 19:16:35.398296 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7qqgx/must-gather-hzj2c"] Sep 30 19:16:36 crc kubenswrapper[4797]: I0930 19:16:36.257891 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" event={"ID":"33788a93-3dcc-4003-be30-4fe40760f228","Type":"ContainerStarted","Data":"2ad0a8ef3b147c43d207e29372f82a6761db6627160a5c9b30da1706c776eae9"} Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.792837 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.797018 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.804873 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.946668 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.946750 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:39 crc kubenswrapper[4797]: I0930 19:16:39.946812 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsll\" (UniqueName: \"kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.048558 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsll\" (UniqueName: \"kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.048830 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.048903 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.049397 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.049554 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.066806 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsll\" (UniqueName: \"kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll\") pod \"redhat-marketplace-6kqfz\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.170332 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.319328 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" event={"ID":"33788a93-3dcc-4003-be30-4fe40760f228","Type":"ContainerStarted","Data":"6afe6e3f75c02423708ccf1a0c1e44197abb51d475b6b583ab3cc71792a6e1b6"} Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.319726 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" event={"ID":"33788a93-3dcc-4003-be30-4fe40760f228","Type":"ContainerStarted","Data":"32b17e9e780f59b5b8cdbad93a293629412d4ba542ff374df6d9161a65b24db7"} Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.340833 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" podStartSLOduration=2.230258184 podStartE2EDuration="6.34081628s" podCreationTimestamp="2025-09-30 19:16:34 +0000 UTC" firstStartedPulling="2025-09-30 19:16:35.397281318 +0000 UTC m=+5645.919780556" lastFinishedPulling="2025-09-30 19:16:39.507839414 +0000 UTC m=+5650.030338652" observedRunningTime="2025-09-30 19:16:40.334841285 +0000 UTC m=+5650.857340533" watchObservedRunningTime="2025-09-30 19:16:40.34081628 +0000 UTC m=+5650.863315518" Sep 30 19:16:40 crc kubenswrapper[4797]: I0930 19:16:40.714564 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:41 crc kubenswrapper[4797]: I0930 19:16:41.328820 4797 generic.go:334] "Generic (PLEG): container finished" podID="5871517d-6d00-4437-8826-01e7e49b68c7" containerID="44c8220b75ff0619639533a0814ffe2569e6bbc25124faa1bcbae271d69365bc" exitCode=0 Sep 30 19:16:41 crc kubenswrapper[4797]: I0930 19:16:41.330663 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" 
event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerDied","Data":"44c8220b75ff0619639533a0814ffe2569e6bbc25124faa1bcbae271d69365bc"} Sep 30 19:16:41 crc kubenswrapper[4797]: I0930 19:16:41.330698 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerStarted","Data":"7dd2b3ac6d0d804e8375b74d97fec12e53da6784410eb79e68feb305032e2c68"} Sep 30 19:16:42 crc kubenswrapper[4797]: I0930 19:16:42.339753 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerStarted","Data":"7af465b781cd56b3ed25c608a1ca4bdfd03a0ba17ea53292c786350e788c62f4"} Sep 30 19:16:42 crc kubenswrapper[4797]: E0930 19:16:42.581889 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:47490->38.102.83.119:33393: write tcp 38.102.83.119:47490->38.102.83.119:33393: write: broken pipe Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.350826 4797 generic.go:334] "Generic (PLEG): container finished" podID="5871517d-6d00-4437-8826-01e7e49b68c7" containerID="7af465b781cd56b3ed25c608a1ca4bdfd03a0ba17ea53292c786350e788c62f4" exitCode=0 Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.350912 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerDied","Data":"7af465b781cd56b3ed25c608a1ca4bdfd03a0ba17ea53292c786350e788c62f4"} Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.512245 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-vplll"] Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.513466 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.623980 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.624821 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8bd\" (UniqueName: \"kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.727087 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8bd\" (UniqueName: \"kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.727231 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.727388 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc 
kubenswrapper[4797]: I0930 19:16:43.743617 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8bd\" (UniqueName: \"kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd\") pod \"crc-debug-vplll\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: I0930 19:16:43.840798 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:16:43 crc kubenswrapper[4797]: W0930 19:16:43.882170 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4892bf_2e95_498a_8577_e581647ce90d.slice/crio-66b53e59b3381f70fdfa6e0df0f0e0af4161fff0f005a4b106e86e06c72bb3f0 WatchSource:0}: Error finding container 66b53e59b3381f70fdfa6e0df0f0e0af4161fff0f005a4b106e86e06c72bb3f0: Status 404 returned error can't find the container with id 66b53e59b3381f70fdfa6e0df0f0e0af4161fff0f005a4b106e86e06c72bb3f0 Sep 30 19:16:44 crc kubenswrapper[4797]: I0930 19:16:44.191564 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:16:44 crc kubenswrapper[4797]: I0930 19:16:44.192195 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:16:44 crc kubenswrapper[4797]: I0930 19:16:44.364091 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerStarted","Data":"fee85d280d9d6c8c4dc88370d0129cebbd6716d2dbc1aa611efa3120a2bf65df"} Sep 30 19:16:44 crc kubenswrapper[4797]: I0930 19:16:44.365764 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-vplll" event={"ID":"3f4892bf-2e95-498a-8577-e581647ce90d","Type":"ContainerStarted","Data":"66b53e59b3381f70fdfa6e0df0f0e0af4161fff0f005a4b106e86e06c72bb3f0"} Sep 30 19:16:44 crc kubenswrapper[4797]: I0930 19:16:44.388461 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6kqfz" podStartSLOduration=2.93976965 podStartE2EDuration="5.388415946s" podCreationTimestamp="2025-09-30 19:16:39 +0000 UTC" firstStartedPulling="2025-09-30 19:16:41.331641042 +0000 UTC m=+5651.854140270" lastFinishedPulling="2025-09-30 19:16:43.780287328 +0000 UTC m=+5654.302786566" observedRunningTime="2025-09-30 19:16:44.382613046 +0000 UTC m=+5654.905112284" watchObservedRunningTime="2025-09-30 19:16:44.388415946 +0000 UTC m=+5654.910915174" Sep 30 19:16:50 crc kubenswrapper[4797]: I0930 19:16:50.171374 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:50 crc kubenswrapper[4797]: I0930 19:16:50.171916 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:50 crc kubenswrapper[4797]: I0930 19:16:50.310253 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:50 crc kubenswrapper[4797]: I0930 19:16:50.497277 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:50 crc kubenswrapper[4797]: I0930 19:16:50.546534 4797 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:52 crc kubenswrapper[4797]: I0930 19:16:52.465159 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6kqfz" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="registry-server" containerID="cri-o://fee85d280d9d6c8c4dc88370d0129cebbd6716d2dbc1aa611efa3120a2bf65df" gracePeriod=2 Sep 30 19:16:53 crc kubenswrapper[4797]: I0930 19:16:53.480050 4797 generic.go:334] "Generic (PLEG): container finished" podID="5871517d-6d00-4437-8826-01e7e49b68c7" containerID="fee85d280d9d6c8c4dc88370d0129cebbd6716d2dbc1aa611efa3120a2bf65df" exitCode=0 Sep 30 19:16:53 crc kubenswrapper[4797]: I0930 19:16:53.480101 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerDied","Data":"fee85d280d9d6c8c4dc88370d0129cebbd6716d2dbc1aa611efa3120a2bf65df"} Sep 30 19:16:54 crc kubenswrapper[4797]: I0930 19:16:54.963877 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.068407 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsll\" (UniqueName: \"kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll\") pod \"5871517d-6d00-4437-8826-01e7e49b68c7\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.068582 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities\") pod \"5871517d-6d00-4437-8826-01e7e49b68c7\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.068727 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content\") pod \"5871517d-6d00-4437-8826-01e7e49b68c7\" (UID: \"5871517d-6d00-4437-8826-01e7e49b68c7\") " Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.072969 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities" (OuterVolumeSpecName: "utilities") pod "5871517d-6d00-4437-8826-01e7e49b68c7" (UID: "5871517d-6d00-4437-8826-01e7e49b68c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.079894 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll" (OuterVolumeSpecName: "kube-api-access-fqsll") pod "5871517d-6d00-4437-8826-01e7e49b68c7" (UID: "5871517d-6d00-4437-8826-01e7e49b68c7"). InnerVolumeSpecName "kube-api-access-fqsll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.083082 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5871517d-6d00-4437-8826-01e7e49b68c7" (UID: "5871517d-6d00-4437-8826-01e7e49b68c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.170839 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.171111 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsll\" (UniqueName: \"kubernetes.io/projected/5871517d-6d00-4437-8826-01e7e49b68c7-kube-api-access-fqsll\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.171124 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5871517d-6d00-4437-8826-01e7e49b68c7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.506347 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kqfz" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.506381 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kqfz" event={"ID":"5871517d-6d00-4437-8826-01e7e49b68c7","Type":"ContainerDied","Data":"7dd2b3ac6d0d804e8375b74d97fec12e53da6784410eb79e68feb305032e2c68"} Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.506461 4797 scope.go:117] "RemoveContainer" containerID="fee85d280d9d6c8c4dc88370d0129cebbd6716d2dbc1aa611efa3120a2bf65df" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.512502 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-vplll" event={"ID":"3f4892bf-2e95-498a-8577-e581647ce90d","Type":"ContainerStarted","Data":"17047573d5c6f98dd133f7068d76e4e3757d2f21c0e8cece83f4cf8d255dfd60"} Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.538912 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7qqgx/crc-debug-vplll" podStartSLOduration=1.808761746 podStartE2EDuration="12.538890453s" podCreationTimestamp="2025-09-30 19:16:43 +0000 UTC" firstStartedPulling="2025-09-30 19:16:43.885249662 +0000 UTC m=+5654.407748900" lastFinishedPulling="2025-09-30 19:16:54.615378369 +0000 UTC m=+5665.137877607" observedRunningTime="2025-09-30 19:16:55.524792065 +0000 UTC m=+5666.047291303" watchObservedRunningTime="2025-09-30 19:16:55.538890453 +0000 UTC m=+5666.061389681" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.575046 4797 scope.go:117] "RemoveContainer" containerID="7af465b781cd56b3ed25c608a1ca4bdfd03a0ba17ea53292c786350e788c62f4" Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.588610 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.599479 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6kqfz"] Sep 30 19:16:55 crc kubenswrapper[4797]: I0930 19:16:55.611336 4797 scope.go:117] "RemoveContainer" containerID="44c8220b75ff0619639533a0814ffe2569e6bbc25124faa1bcbae271d69365bc" Sep 30 19:16:56 crc kubenswrapper[4797]: I0930 19:16:56.248854 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" path="/var/lib/kubelet/pods/5871517d-6d00-4437-8826-01e7e49b68c7/volumes" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.191641 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.192256 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.192302 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.193174 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.193237 4797 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" gracePeriod=600 Sep 30 19:17:14 crc kubenswrapper[4797]: E0930 19:17:14.338648 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.694512 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" exitCode=0 Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.694599 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0"} Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.694647 4797 scope.go:117] "RemoveContainer" containerID="9cdd310b4fd09a9aecb11c790651e043eb641b08219e5af25e9d86de8ef127d8" Sep 30 19:17:14 crc kubenswrapper[4797]: I0930 19:17:14.695567 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:17:14 crc kubenswrapper[4797]: E0930 19:17:14.696075 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:17:29 crc kubenswrapper[4797]: I0930 19:17:29.238756 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:17:29 crc kubenswrapper[4797]: E0930 19:17:29.239610 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:17:42 crc kubenswrapper[4797]: I0930 19:17:42.238621 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:17:42 crc kubenswrapper[4797]: E0930 19:17:42.239341 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:17:53 crc kubenswrapper[4797]: I0930 19:17:53.242927 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:17:53 crc kubenswrapper[4797]: E0930 19:17:53.244678 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:18:06 crc kubenswrapper[4797]: I0930 19:18:06.239394 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:18:06 crc kubenswrapper[4797]: E0930 19:18:06.245493 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:18:08 crc kubenswrapper[4797]: I0930 19:18:08.212094 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c5546bcbd-84n4q_9365f350-1fad-4ab1-a694-49912e391383/barbican-api/0.log" Sep 30 19:18:08 crc kubenswrapper[4797]: I0930 19:18:08.391225 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c5546bcbd-84n4q_9365f350-1fad-4ab1-a694-49912e391383/barbican-api-log/0.log" Sep 30 19:18:08 crc kubenswrapper[4797]: I0930 19:18:08.597871 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5768c5b854-k959d_9c9236f9-becb-4d5c-aeb5-56a3b0547c86/barbican-keystone-listener/0.log" Sep 30 19:18:08 crc kubenswrapper[4797]: I0930 19:18:08.784499 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5768c5b854-k959d_9c9236f9-becb-4d5c-aeb5-56a3b0547c86/barbican-keystone-listener-log/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.000634 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-7b75d759d5-6bwm5_080f211c-e410-4f16-af62-78ce0d6d9d26/barbican-worker/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.051848 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b75d759d5-6bwm5_080f211c-e410-4f16-af62-78ce0d6d9d26/barbican-worker-log/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.262616 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg_76ed1105-fad0-4d4d-9039-06795b66a457/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.662748 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/ceilometer-central-agent/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.844586 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/proxy-httpd/0.log" Sep 30 19:18:09 crc kubenswrapper[4797]: I0930 19:18:09.860763 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/ceilometer-notification-agent/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.005941 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/sg-core/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.263352 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9f1efb9-4e3d-4371-bd43-55cffbe2d06d/cinder-api/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.272546 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9f1efb9-4e3d-4371-bd43-55cffbe2d06d/cinder-api-log/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.490711 4797 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_6910128f-6ddf-4edf-86b6-a313f85db70d/cinder-scheduler/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.567025 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6910128f-6ddf-4edf-86b6-a313f85db70d/probe/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.671793 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-k9785_49ebc230-80f5-4bd0-a2fb-91cd9705a000/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:10 crc kubenswrapper[4797]: I0930 19:18:10.857640 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-g45vj_d44c9877-2212-4102-8f03-d2cf682cf7b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.007879 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct_994f3ed9-cd04-48c1-a7ab-f0c3d08b5858/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.125285 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/init/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.283038 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/init/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.495106 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h_deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.510515 4797 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/dnsmasq-dns/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.673343 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7484ca2f-31fc-4ede-bdcc-2ce25e4d5023/glance-log/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.723992 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7484ca2f-31fc-4ede-bdcc-2ce25e4d5023/glance-httpd/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.890184 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2bd684cf-1443-4068-8b8f-7b1961474c80/glance-httpd/0.log" Sep 30 19:18:11 crc kubenswrapper[4797]: I0930 19:18:11.909502 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2bd684cf-1443-4068-8b8f-7b1961474c80/glance-log/0.log" Sep 30 19:18:12 crc kubenswrapper[4797]: I0930 19:18:12.213881 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon/1.log" Sep 30 19:18:12 crc kubenswrapper[4797]: I0930 19:18:12.228518 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon/0.log" Sep 30 19:18:12 crc kubenswrapper[4797]: I0930 19:18:12.456294 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65dn7_047af11f-ac90-41df-96c5-be75581aff10/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:12 crc kubenswrapper[4797]: I0930 19:18:12.706493 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-k5jg9_2ab411d2-db8f-47ff-9233-739acad6d3ee/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:12 crc kubenswrapper[4797]: I0930 19:18:12.905911 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon-log/0.log" Sep 30 19:18:13 crc kubenswrapper[4797]: I0930 19:18:13.131869 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320981-sk9mv_90f2ee8c-9aaa-4cbc-bde8-25bc8a297045/keystone-cron/0.log" Sep 30 19:18:13 crc kubenswrapper[4797]: I0930 19:18:13.293836 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1084da2e-fefb-4741-89c4-90257f878bf8/kube-state-metrics/0.log" Sep 30 19:18:13 crc kubenswrapper[4797]: I0930 19:18:13.368483 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b748fb867-znqws_af775f0c-a3ef-4bd7-bf2e-cecdacda03ff/keystone-api/0.log" Sep 30 19:18:13 crc kubenswrapper[4797]: I0930 19:18:13.548759 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vzb94_a008ddae-ddb5-47a3-9423-0da1ffdb8322/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:14 crc kubenswrapper[4797]: I0930 19:18:14.069927 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4d4d55c-fzms2_6726c7e0-d359-494e-9a9b-54c878d16e6b/neutron-httpd/0.log" Sep 30 19:18:14 crc kubenswrapper[4797]: I0930 19:18:14.145543 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4d4d55c-fzms2_6726c7e0-d359-494e-9a9b-54c878d16e6b/neutron-api/0.log" Sep 30 19:18:14 crc kubenswrapper[4797]: I0930 19:18:14.252498 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q_93bccc4f-33ad-45b9-9549-20ba5484888f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:15 crc kubenswrapper[4797]: I0930 19:18:15.106531 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_627d236c-592a-46d3-9ef3-5adc1749c0c9/nova-cell0-conductor-conductor/0.log" Sep 30 19:18:15 crc kubenswrapper[4797]: I0930 19:18:15.676935 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_549d2ac6-4a20-4698-ad97-b4a94dab16e0/nova-cell1-conductor-conductor/0.log" Sep 30 19:18:15 crc kubenswrapper[4797]: I0930 19:18:15.762581 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e0b40b7a-54a6-4fb4-868d-85f26823aeb3/nova-api-log/0.log" Sep 30 19:18:16 crc kubenswrapper[4797]: I0930 19:18:16.050842 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e0b40b7a-54a6-4fb4-868d-85f26823aeb3/nova-api-api/0.log" Sep 30 19:18:16 crc kubenswrapper[4797]: I0930 19:18:16.104594 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e16d4537-8c99-431f-bcd6-da24200f085b/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 19:18:16 crc kubenswrapper[4797]: I0930 19:18:16.401401 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-92w66_cd2701f6-0eb3-4359-a16d-7435179896c0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:16 crc kubenswrapper[4797]: I0930 19:18:16.488450 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4225b925-9d11-4b4b-8e2d-1063584ef26c/nova-metadata-log/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.033505 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b/nova-scheduler-scheduler/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.057746 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/mysql-bootstrap/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.245867 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/galera/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.290765 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/mysql-bootstrap/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.561043 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/mysql-bootstrap/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.786885 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/mysql-bootstrap/0.log" Sep 30 19:18:17 crc kubenswrapper[4797]: I0930 19:18:17.864620 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/galera/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.109659 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2a0d8448-cb6f-4fe4-9458-fad3bfd11471/openstackclient/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.331708 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8bw4c_828f5c5e-04c9-49c0-8056-7c930e756a44/openstack-network-exporter/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.533142 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-nf4pk_4a527992-92f7-4aab-b8d4-e75ec72fd684/ovn-controller/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.745955 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server-init/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.902530 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server-init/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.928607 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovs-vswitchd/0.log" Sep 30 19:18:18 crc kubenswrapper[4797]: I0930 19:18:18.972991 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4225b925-9d11-4b4b-8e2d-1063584ef26c/nova-metadata-metadata/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.096774 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.206523 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dmdz9_3d61414a-adba-4fcd-b3ca-417935b2c4db/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.325627 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3139ddf0-f590-40e2-bd15-0af615d5cbf1/openstack-network-exporter/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.410356 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3139ddf0-f590-40e2-bd15-0af615d5cbf1/ovn-northd/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.626908 4797 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df911ba-9b38-46e5-b779-3db695c839a9/ovsdbserver-nb/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.672789 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df911ba-9b38-46e5-b779-3db695c839a9/openstack-network-exporter/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.843369 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3e42915-c1cf-479d-8cb1-d337a4407d64/openstack-network-exporter/0.log" Sep 30 19:18:19 crc kubenswrapper[4797]: I0930 19:18:19.901832 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3e42915-c1cf-479d-8cb1-d337a4407d64/ovsdbserver-sb/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.248287 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:18:20 crc kubenswrapper[4797]: E0930 19:18:20.248736 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.322648 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c47f4984-nxfz7_cd02f0fa-36e8-4676-802c-37127e022ad0/placement-api/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.344988 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c47f4984-nxfz7_cd02f0fa-36e8-4676-802c-37127e022ad0/placement-log/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.543577 4797 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/init-config-reloader/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.717036 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/config-reloader/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.718864 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/init-config-reloader/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.794220 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/prometheus/0.log" Sep 30 19:18:20 crc kubenswrapper[4797]: I0930 19:18:20.942174 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/thanos-sidecar/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.063469 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/setup-container/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.191642 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/setup-container/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.323249 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/rabbitmq/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.426803 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/setup-container/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.596041 4797 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/setup-container/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.694572 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/rabbitmq/0.log" Sep 30 19:18:21 crc kubenswrapper[4797]: I0930 19:18:21.785292 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb_1c7d707f-71bd-4194-b7c2-14a592f9772c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.040821 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-j5hr2_d8595051-9106-4b9a-bc5a-0a3e2e6ad11f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.234314 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh_bff66612-d0f8-4159-a096-478975f4d2e5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.433426 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-grthd_9b428893-75cc-423a-8ce8-31ccc2068037/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.591132 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rwq96_1976056c-3312-40a7-b5ae-f287c229e0a3/ssh-known-hosts-edpm-deployment/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.791506 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-594ff6944c-p2jp5_89ca411e-ead4-4a2d-9eba-f3f8fffcad46/proxy-server/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.905347 4797 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-proxy-594ff6944c-p2jp5_89ca411e-ead4-4a2d-9eba-f3f8fffcad46/proxy-httpd/0.log" Sep 30 19:18:22 crc kubenswrapper[4797]: I0930 19:18:22.971922 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cx5sg_139c3278-4f30-418b-ae01-2ea9ac63ab55/swift-ring-rebalance/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.073348 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-auditor/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.163046 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-reaper/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.313552 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-replicator/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.364275 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-server/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.377658 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-auditor/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.578513 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-updater/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.594062 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-replicator/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.647018 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-server/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.808585 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-auditor/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.830169 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-expirer/0.log" Sep 30 19:18:23 crc kubenswrapper[4797]: I0930 19:18:23.955014 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-replicator/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.008145 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-server/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.061812 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-updater/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.145618 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/rsync/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.214653 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/swift-recon-cron/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.441612 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm_287e8ba5-a33d-49ac-bd3f-b85dd5a401a3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.574917 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_05275916-b3be-4d53-8a06-ab3d5c8b3f7b/tempest-tests-tempest-tests-runner/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.660733 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2754f75a-653d-4ec9-8bac-e81f0353ec88/test-operator-logs-container/0.log" Sep 30 19:18:24 crc kubenswrapper[4797]: I0930 19:18:24.873112 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-fkswn_14617993-cbb9-43c8-9ec5-d3a3afb1bc19/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:18:26 crc kubenswrapper[4797]: I0930 19:18:26.006582 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_a04de7be-0f64-475b-8f90-5fb466645c02/watcher-applier/0.log" Sep 30 19:18:26 crc kubenswrapper[4797]: I0930 19:18:26.123199 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_84edad16-e218-42bd-bba3-77d16184436c/watcher-api-log/0.log" Sep 30 19:18:27 crc kubenswrapper[4797]: I0930 19:18:27.503198 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_103ee950-a749-41ce-be1e-bdfb715bc7ad/watcher-decision-engine/0.log" Sep 30 19:18:27 crc kubenswrapper[4797]: I0930 19:18:27.767660 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6e728b8d-50b8-43df-bb8c-e3cbfce614e9/memcached/0.log" Sep 30 19:18:29 crc kubenswrapper[4797]: I0930 19:18:29.097459 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_84edad16-e218-42bd-bba3-77d16184436c/watcher-api/0.log" Sep 30 19:18:31 crc kubenswrapper[4797]: I0930 19:18:31.239208 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:18:31 crc kubenswrapper[4797]: E0930 19:18:31.239842 
4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:18:44 crc kubenswrapper[4797]: I0930 19:18:44.239274 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:18:44 crc kubenswrapper[4797]: E0930 19:18:44.240648 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:18:58 crc kubenswrapper[4797]: I0930 19:18:58.238579 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:18:58 crc kubenswrapper[4797]: E0930 19:18:58.239490 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:19:06 crc kubenswrapper[4797]: I0930 19:19:06.807345 4797 generic.go:334] "Generic (PLEG): container finished" podID="3f4892bf-2e95-498a-8577-e581647ce90d" 
containerID="17047573d5c6f98dd133f7068d76e4e3757d2f21c0e8cece83f4cf8d255dfd60" exitCode=0 Sep 30 19:19:06 crc kubenswrapper[4797]: I0930 19:19:06.807402 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-vplll" event={"ID":"3f4892bf-2e95-498a-8577-e581647ce90d","Type":"ContainerDied","Data":"17047573d5c6f98dd133f7068d76e4e3757d2f21c0e8cece83f4cf8d255dfd60"} Sep 30 19:19:07 crc kubenswrapper[4797]: I0930 19:19:07.925548 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:19:07 crc kubenswrapper[4797]: I0930 19:19:07.960386 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-vplll"] Sep 30 19:19:07 crc kubenswrapper[4797]: I0930 19:19:07.970110 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-vplll"] Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.051349 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host\") pod \"3f4892bf-2e95-498a-8577-e581647ce90d\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.051539 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host" (OuterVolumeSpecName: "host") pod "3f4892bf-2e95-498a-8577-e581647ce90d" (UID: "3f4892bf-2e95-498a-8577-e581647ce90d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.051694 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k8bd\" (UniqueName: \"kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd\") pod \"3f4892bf-2e95-498a-8577-e581647ce90d\" (UID: \"3f4892bf-2e95-498a-8577-e581647ce90d\") " Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.052381 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4892bf-2e95-498a-8577-e581647ce90d-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.058787 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd" (OuterVolumeSpecName: "kube-api-access-5k8bd") pod "3f4892bf-2e95-498a-8577-e581647ce90d" (UID: "3f4892bf-2e95-498a-8577-e581647ce90d"). InnerVolumeSpecName "kube-api-access-5k8bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.154384 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k8bd\" (UniqueName: \"kubernetes.io/projected/3f4892bf-2e95-498a-8577-e581647ce90d-kube-api-access-5k8bd\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.254184 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4892bf-2e95-498a-8577-e581647ce90d" path="/var/lib/kubelet/pods/3f4892bf-2e95-498a-8577-e581647ce90d/volumes" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.828422 4797 scope.go:117] "RemoveContainer" containerID="17047573d5c6f98dd133f7068d76e4e3757d2f21c0e8cece83f4cf8d255dfd60" Sep 30 19:19:08 crc kubenswrapper[4797]: I0930 19:19:08.828511 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-vplll" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.169231 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-pcm9x"] Sep 30 19:19:09 crc kubenswrapper[4797]: E0930 19:19:09.171022 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="extract-content" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.171096 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="extract-content" Sep 30 19:19:09 crc kubenswrapper[4797]: E0930 19:19:09.171131 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="registry-server" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.171142 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="registry-server" Sep 30 19:19:09 crc kubenswrapper[4797]: E0930 19:19:09.171197 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4892bf-2e95-498a-8577-e581647ce90d" containerName="container-00" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.171211 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4892bf-2e95-498a-8577-e581647ce90d" containerName="container-00" Sep 30 19:19:09 crc kubenswrapper[4797]: E0930 19:19:09.171274 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="extract-utilities" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.171284 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" containerName="extract-utilities" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.172021 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="5871517d-6d00-4437-8826-01e7e49b68c7" 
containerName="registry-server" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.172063 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4892bf-2e95-498a-8577-e581647ce90d" containerName="container-00" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.173712 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.277167 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.277778 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9jc\" (UniqueName: \"kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.379550 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9jc\" (UniqueName: \"kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.380091 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc 
kubenswrapper[4797]: I0930 19:19:09.380480 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.402673 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9jc\" (UniqueName: \"kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc\") pod \"crc-debug-pcm9x\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.503322 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.843940 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" event={"ID":"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab","Type":"ContainerStarted","Data":"c3831d27843532d14f3c2a620aee1d1d6e4b53d325f00d08ede2998740e2f31d"} Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.844293 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" event={"ID":"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab","Type":"ContainerStarted","Data":"d2f907377aa9cf23735d86909ec98dd7a61724b50a51270ebb41c092fcbece77"} Sep 30 19:19:09 crc kubenswrapper[4797]: I0930 19:19:09.860924 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" podStartSLOduration=0.860906401 podStartE2EDuration="860.906401ms" podCreationTimestamp="2025-09-30 19:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
19:19:09.856886751 +0000 UTC m=+5800.379385989" watchObservedRunningTime="2025-09-30 19:19:09.860906401 +0000 UTC m=+5800.383405639" Sep 30 19:19:10 crc kubenswrapper[4797]: I0930 19:19:10.851413 4797 generic.go:334] "Generic (PLEG): container finished" podID="8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" containerID="c3831d27843532d14f3c2a620aee1d1d6e4b53d325f00d08ede2998740e2f31d" exitCode=0 Sep 30 19:19:10 crc kubenswrapper[4797]: I0930 19:19:10.851473 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" event={"ID":"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab","Type":"ContainerDied","Data":"c3831d27843532d14f3c2a620aee1d1d6e4b53d325f00d08ede2998740e2f31d"} Sep 30 19:19:11 crc kubenswrapper[4797]: I0930 19:19:11.237546 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:19:11 crc kubenswrapper[4797]: E0930 19:19:11.237764 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:19:11 crc kubenswrapper[4797]: I0930 19:19:11.992417 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.130263 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host\") pod \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.130468 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host" (OuterVolumeSpecName: "host") pod "8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" (UID: "8bd2fa5c-4108-4bc3-935a-51d32a9c65ab"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.130892 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs9jc\" (UniqueName: \"kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc\") pod \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\" (UID: \"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab\") " Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.131605 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.141837 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc" (OuterVolumeSpecName: "kube-api-access-xs9jc") pod "8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" (UID: "8bd2fa5c-4108-4bc3-935a-51d32a9c65ab"). InnerVolumeSpecName "kube-api-access-xs9jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.233094 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs9jc\" (UniqueName: \"kubernetes.io/projected/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab-kube-api-access-xs9jc\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.899473 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" event={"ID":"8bd2fa5c-4108-4bc3-935a-51d32a9c65ab","Type":"ContainerDied","Data":"d2f907377aa9cf23735d86909ec98dd7a61724b50a51270ebb41c092fcbece77"} Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.899548 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f907377aa9cf23735d86909ec98dd7a61724b50a51270ebb41c092fcbece77" Sep 30 19:19:12 crc kubenswrapper[4797]: I0930 19:19:12.899611 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-pcm9x" Sep 30 19:19:20 crc kubenswrapper[4797]: I0930 19:19:20.131803 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-pcm9x"] Sep 30 19:19:20 crc kubenswrapper[4797]: I0930 19:19:20.142559 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-pcm9x"] Sep 30 19:19:20 crc kubenswrapper[4797]: I0930 19:19:20.250990 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" path="/var/lib/kubelet/pods/8bd2fa5c-4108-4bc3-935a-51d32a9c65ab/volumes" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.307676 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-wv7d6"] Sep 30 19:19:21 crc kubenswrapper[4797]: E0930 19:19:21.308465 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" 
containerName="container-00" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.308478 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" containerName="container-00" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.308668 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd2fa5c-4108-4bc3-935a-51d32a9c65ab" containerName="container-00" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.309358 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.384559 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khtp\" (UniqueName: \"kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.384734 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.487302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7khtp\" (UniqueName: \"kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.487595 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.488047 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.514646 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khtp\" (UniqueName: \"kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp\") pod \"crc-debug-wv7d6\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.632262 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:21 crc kubenswrapper[4797]: W0930 19:19:21.663121 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9e853e_7446_4d17_99ce_eda1f398111f.slice/crio-92174ac7381b583a1a2a6aa3f130be013ed21bf752b367bc69f748be782e524c WatchSource:0}: Error finding container 92174ac7381b583a1a2a6aa3f130be013ed21bf752b367bc69f748be782e524c: Status 404 returned error can't find the container with id 92174ac7381b583a1a2a6aa3f130be013ed21bf752b367bc69f748be782e524c Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.991412 4797 generic.go:334] "Generic (PLEG): container finished" podID="bc9e853e-7446-4d17-99ce-eda1f398111f" containerID="a93bfc6c59ecb12e4ef854f33879a423b89fc594749cb1c0cdfa4baa5ead6f04" exitCode=0 Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.991490 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" event={"ID":"bc9e853e-7446-4d17-99ce-eda1f398111f","Type":"ContainerDied","Data":"a93bfc6c59ecb12e4ef854f33879a423b89fc594749cb1c0cdfa4baa5ead6f04"} Sep 30 19:19:21 crc kubenswrapper[4797]: I0930 19:19:21.991896 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" event={"ID":"bc9e853e-7446-4d17-99ce-eda1f398111f","Type":"ContainerStarted","Data":"92174ac7381b583a1a2a6aa3f130be013ed21bf752b367bc69f748be782e524c"} Sep 30 19:19:22 crc kubenswrapper[4797]: I0930 19:19:22.034605 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-wv7d6"] Sep 30 19:19:22 crc kubenswrapper[4797]: I0930 19:19:22.042825 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7qqgx/crc-debug-wv7d6"] Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.129599 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.221883 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7khtp\" (UniqueName: \"kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp\") pod \"bc9e853e-7446-4d17-99ce-eda1f398111f\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.221951 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host\") pod \"bc9e853e-7446-4d17-99ce-eda1f398111f\" (UID: \"bc9e853e-7446-4d17-99ce-eda1f398111f\") " Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.222038 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host" (OuterVolumeSpecName: "host") pod "bc9e853e-7446-4d17-99ce-eda1f398111f" (UID: "bc9e853e-7446-4d17-99ce-eda1f398111f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.222598 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc9e853e-7446-4d17-99ce-eda1f398111f-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.235638 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp" (OuterVolumeSpecName: "kube-api-access-7khtp") pod "bc9e853e-7446-4d17-99ce-eda1f398111f" (UID: "bc9e853e-7446-4d17-99ce-eda1f398111f"). InnerVolumeSpecName "kube-api-access-7khtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.324194 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7khtp\" (UniqueName: \"kubernetes.io/projected/bc9e853e-7446-4d17-99ce-eda1f398111f-kube-api-access-7khtp\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:23 crc kubenswrapper[4797]: I0930 19:19:23.797289 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.012316 4797 scope.go:117] "RemoveContainer" containerID="a93bfc6c59ecb12e4ef854f33879a423b89fc594749cb1c0cdfa4baa5ead6f04" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.012354 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/crc-debug-wv7d6" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.048292 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.071036 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.086303 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.252229 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9e853e-7446-4d17-99ce-eda1f398111f" 
path="/var/lib/kubelet/pods/bc9e853e-7446-4d17-99ce-eda1f398111f/volumes" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.265550 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/extract/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.270905 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.277132 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.481417 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-csf68_3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa/kube-rbac-proxy/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.567484 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-wvthj_481318fa-263c-4a4b-b775-879776670ddb/kube-rbac-proxy/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.579532 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-csf68_3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa/manager/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.758967 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-85wtx_89b215ec-763f-4eb9-aef0-7f5b1d43481d/manager/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.788330 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-85wtx_89b215ec-763f-4eb9-aef0-7f5b1d43481d/kube-rbac-proxy/0.log" Sep 30 19:19:24 crc kubenswrapper[4797]: I0930 19:19:24.800568 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-wvthj_481318fa-263c-4a4b-b775-879776670ddb/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.004195 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-585mg_72a082c8-41b8-4666-bdd1-8f998dedc4c3/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.070308 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-585mg_72a082c8-41b8-4666-bdd1-8f998dedc4c3/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.175655 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h2z55_c3c39950-97e6-423c-8884-b65548f38830/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.232884 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h2z55_c3c39950-97e6-423c-8884-b65548f38830/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.298059 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-d2mm8_07a9edee-b2ec-48d8-85b3-191f2f29bf73/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.380152 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-d2mm8_07a9edee-b2ec-48d8-85b3-191f2f29bf73/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.506364 4797 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-9zcc2_1c601f06-9978-4f2b-8f37-2fa1bef8e8dd/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.694336 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-9zcc2_1c601f06-9978-4f2b-8f37-2fa1bef8e8dd/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.709866 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4wpww_9f9e430f-f1af-46a5-9885-2e25473d376d/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.759799 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4wpww_9f9e430f-f1af-46a5-9885-2e25473d376d/manager/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.937334 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-slc6t_8824a3d0-28dc-42eb-b767-b9425f556076/kube-rbac-proxy/0.log" Sep 30 19:19:25 crc kubenswrapper[4797]: I0930 19:19:25.982175 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-slc6t_8824a3d0-28dc-42eb-b767-b9425f556076/manager/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.104832 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vlc4g_24d8b2e3-5124-4bac-8cb1-871daabad7e6/kube-rbac-proxy/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.152014 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vlc4g_24d8b2e3-5124-4bac-8cb1-871daabad7e6/manager/0.log" Sep 30 19:19:26 crc 
kubenswrapper[4797]: I0930 19:19:26.215148 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n564q_65d324d6-26a4-4a59-a29d-a92cad26a07a/kube-rbac-proxy/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.243013 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:19:26 crc kubenswrapper[4797]: E0930 19:19:26.243772 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.334325 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n564q_65d324d6-26a4-4a59-a29d-a92cad26a07a/manager/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.409367 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-qlrv8_9508eece-17ea-4b43-9bdb-6c2f8da6e21f/kube-rbac-proxy/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.480129 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-qlrv8_9508eece-17ea-4b43-9bdb-6c2f8da6e21f/manager/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.622119 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-d7r9l_04f5a5c7-892a-4aa5-8e21-ec847d9e29fb/kube-rbac-proxy/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.725184 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-d7r9l_04f5a5c7-892a-4aa5-8e21-ec847d9e29fb/manager/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.790665 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-c58zh_3195fff1-53f5-491a-869b-0f7fc5e45df6/kube-rbac-proxy/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.896621 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-c58zh_3195fff1-53f5-491a-869b-0f7fc5e45df6/manager/0.log" Sep 30 19:19:26 crc kubenswrapper[4797]: I0930 19:19:26.936034 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-zj2w6_d69ffa93-8979-4922-8aee-7ea26fede6b4/kube-rbac-proxy/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.003122 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-zj2w6_d69ffa93-8979-4922-8aee-7ea26fede6b4/manager/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.135662 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc7c668c-46p2f_a618912d-66f8-4486-8e69-d3dc16f3cb34/kube-rbac-proxy/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.396622 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5c6649c9b9-x9zfh_3192f5d1-b6e4-4471-9adb-24c613a970f4/kube-rbac-proxy/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.511957 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qkts8_43506dea-1282-457a-9201-a2c9f9baa6f3/registry-server/0.log" Sep 30 19:19:27 crc 
kubenswrapper[4797]: I0930 19:19:27.562361 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5c6649c9b9-x9zfh_3192f5d1-b6e4-4471-9adb-24c613a970f4/operator/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.722798 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-dbxcb_05a4965a-f8d6-4859-ab5d-87773f6f6981/kube-rbac-proxy/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.794214 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-dbxcb_05a4965a-f8d6-4859-ab5d-87773f6f6981/manager/0.log" Sep 30 19:19:27 crc kubenswrapper[4797]: I0930 19:19:27.978623 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-xh5rb_8a1ceaa0-b6e6-442d-84d0-3fba075b136c/kube-rbac-proxy/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.201083 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-gcwrs_ca63e090-a37a-4150-9c58-edf133c74c99/operator/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.203398 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-xh5rb_8a1ceaa0-b6e6-442d-84d0-3fba075b136c/manager/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.303849 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-grp8f_c9c41380-c9ee-4467-b343-0f6cf78d51bc/kube-rbac-proxy/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.400462 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc7c668c-46p2f_a618912d-66f8-4486-8e69-d3dc16f3cb34/manager/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.453638 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-grp8f_c9c41380-c9ee-4467-b343-0f6cf78d51bc/manager/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.545139 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5vhf6_ff0b391c-ac01-4a17-9381-a1e2b00d044d/kube-rbac-proxy/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.680160 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-66ff6_be859676-32a8-4144-94fd-ab0da94ce6bc/kube-rbac-proxy/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.715827 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-66ff6_be859676-32a8-4144-94fd-ab0da94ce6bc/manager/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.842830 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5vhf6_ff0b391c-ac01-4a17-9381-a1e2b00d044d/manager/0.log" Sep 30 19:19:28 crc kubenswrapper[4797]: I0930 19:19:28.936324 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-598db9dcc9-jbsh8_508a8f28-1d71-43ac-b24b-65f226abf807/kube-rbac-proxy/0.log" Sep 30 19:19:29 crc kubenswrapper[4797]: I0930 19:19:29.028853 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-598db9dcc9-jbsh8_508a8f28-1d71-43ac-b24b-65f226abf807/manager/0.log" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.302739 4797 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"] Sep 30 19:19:34 crc kubenswrapper[4797]: E0930 19:19:34.304445 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9e853e-7446-4d17-99ce-eda1f398111f" containerName="container-00" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.304462 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9e853e-7446-4d17-99ce-eda1f398111f" containerName="container-00" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.304740 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9e853e-7446-4d17-99ce-eda1f398111f" containerName="container-00" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.307128 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.321784 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"] Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.379786 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.379882 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.379925 4797 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckfk\" (UniqueName: \"kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.481834 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.481907 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.481940 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bckfk\" (UniqueName: \"kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.482719 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.482833 4797 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.500128 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckfk\" (UniqueName: \"kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk\") pod \"redhat-operators-tz2v6\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:34 crc kubenswrapper[4797]: I0930 19:19:34.633947 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:35 crc kubenswrapper[4797]: I0930 19:19:35.162064 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"] Sep 30 19:19:36 crc kubenswrapper[4797]: I0930 19:19:36.140263 4797 generic.go:334] "Generic (PLEG): container finished" podID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerID="7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f" exitCode=0 Sep 30 19:19:36 crc kubenswrapper[4797]: I0930 19:19:36.140371 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerDied","Data":"7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f"} Sep 30 19:19:36 crc kubenswrapper[4797]: I0930 19:19:36.140562 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerStarted","Data":"aba437640f8d7f992cee3b0901ee6abb0cc2440b90c80c18e57d13bd86570c1a"} Sep 30 19:19:38 crc kubenswrapper[4797]: I0930 19:19:38.193799 4797 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerStarted","Data":"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2"} Sep 30 19:19:40 crc kubenswrapper[4797]: I0930 19:19:40.217031 4797 generic.go:334] "Generic (PLEG): container finished" podID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerID="7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2" exitCode=0 Sep 30 19:19:40 crc kubenswrapper[4797]: I0930 19:19:40.217121 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerDied","Data":"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2"} Sep 30 19:19:40 crc kubenswrapper[4797]: I0930 19:19:40.243507 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:19:40 crc kubenswrapper[4797]: E0930 19:19:40.243773 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:19:41 crc kubenswrapper[4797]: I0930 19:19:41.230927 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerStarted","Data":"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1"} Sep 30 19:19:41 crc kubenswrapper[4797]: I0930 19:19:41.257229 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tz2v6" 
podStartSLOduration=2.531148247 podStartE2EDuration="7.257209584s" podCreationTimestamp="2025-09-30 19:19:34 +0000 UTC" firstStartedPulling="2025-09-30 19:19:36.143824265 +0000 UTC m=+5826.666323503" lastFinishedPulling="2025-09-30 19:19:40.869885572 +0000 UTC m=+5831.392384840" observedRunningTime="2025-09-30 19:19:41.253007468 +0000 UTC m=+5831.775506716" watchObservedRunningTime="2025-09-30 19:19:41.257209584 +0000 UTC m=+5831.779708812" Sep 30 19:19:44 crc kubenswrapper[4797]: I0930 19:19:44.634274 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:44 crc kubenswrapper[4797]: I0930 19:19:44.634845 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:45 crc kubenswrapper[4797]: I0930 19:19:45.075794 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ndhrr_1595bfb1-9c13-4148-a2ae-075a0fb0e05b/control-plane-machine-set-operator/0.log" Sep 30 19:19:45 crc kubenswrapper[4797]: I0930 19:19:45.296546 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8p4lg_c0adfd2b-83f6-41ea-beee-cd0a5ac3973b/kube-rbac-proxy/0.log" Sep 30 19:19:45 crc kubenswrapper[4797]: I0930 19:19:45.343281 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8p4lg_c0adfd2b-83f6-41ea-beee-cd0a5ac3973b/machine-api-operator/0.log" Sep 30 19:19:45 crc kubenswrapper[4797]: I0930 19:19:45.721171 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tz2v6" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="registry-server" probeResult="failure" output=< Sep 30 19:19:45 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 19:19:45 crc 
kubenswrapper[4797]: >
Sep 30 19:19:54 crc kubenswrapper[4797]: I0930 19:19:54.692811 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tz2v6"
Sep 30 19:19:54 crc kubenswrapper[4797]: I0930 19:19:54.770065 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tz2v6"
Sep 30 19:19:54 crc kubenswrapper[4797]: I0930 19:19:54.929827 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"]
Sep 30 19:19:55 crc kubenswrapper[4797]: I0930 19:19:55.239024 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0"
Sep 30 19:19:55 crc kubenswrapper[4797]: E0930 19:19:55.239293 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.379584 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tz2v6" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="registry-server" containerID="cri-o://b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1" gracePeriod=2
Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.904551 4797 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.963128 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content\") pod \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.963209 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities\") pod \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.963300 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bckfk\" (UniqueName: \"kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk\") pod \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\" (UID: \"a6551fe3-4cee-489f-9afc-2f884cb9cf00\") " Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.966036 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities" (OuterVolumeSpecName: "utilities") pod "a6551fe3-4cee-489f-9afc-2f884cb9cf00" (UID: "a6551fe3-4cee-489f-9afc-2f884cb9cf00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:19:56 crc kubenswrapper[4797]: I0930 19:19:56.980734 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk" (OuterVolumeSpecName: "kube-api-access-bckfk") pod "a6551fe3-4cee-489f-9afc-2f884cb9cf00" (UID: "a6551fe3-4cee-489f-9afc-2f884cb9cf00"). InnerVolumeSpecName "kube-api-access-bckfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.056391 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6551fe3-4cee-489f-9afc-2f884cb9cf00" (UID: "a6551fe3-4cee-489f-9afc-2f884cb9cf00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.065389 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.065424 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6551fe3-4cee-489f-9afc-2f884cb9cf00-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.065446 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bckfk\" (UniqueName: \"kubernetes.io/projected/a6551fe3-4cee-489f-9afc-2f884cb9cf00-kube-api-access-bckfk\") on node \"crc\" DevicePath \"\"" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.390036 4797 generic.go:334] "Generic (PLEG): container finished" podID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerID="b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1" exitCode=0 Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.390081 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerDied","Data":"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1"} Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.390114 4797 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tz2v6" event={"ID":"a6551fe3-4cee-489f-9afc-2f884cb9cf00","Type":"ContainerDied","Data":"aba437640f8d7f992cee3b0901ee6abb0cc2440b90c80c18e57d13bd86570c1a"} Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.390104 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tz2v6" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.390136 4797 scope.go:117] "RemoveContainer" containerID="b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.413934 4797 scope.go:117] "RemoveContainer" containerID="7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.426494 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"] Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.432450 4797 scope.go:117] "RemoveContainer" containerID="7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.442592 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tz2v6"] Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.494357 4797 scope.go:117] "RemoveContainer" containerID="b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1" Sep 30 19:19:57 crc kubenswrapper[4797]: E0930 19:19:57.495002 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1\": container with ID starting with b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1 not found: ID does not exist" containerID="b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1" Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.495656 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1"} err="failed to get container status \"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1\": rpc error: code = NotFound desc = could not find container \"b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1\": container with ID starting with b2ef9ee6817ce06f54d016b858acc4b95e24607d4fc936e83b197f74dbe991b1 not found: ID does not exist"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.495970 4797 scope.go:117] "RemoveContainer" containerID="7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2"
Sep 30 19:19:57 crc kubenswrapper[4797]: E0930 19:19:57.496845 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2\": container with ID starting with 7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2 not found: ID does not exist" containerID="7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.496888 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2"} err="failed to get container status \"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2\": rpc error: code = NotFound desc = could not find container \"7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2\": container with ID starting with 7a21bfc5f185d00bd5e6975317dda5b1780464188b821cf6661239338d028fc2 not found: ID does not exist"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.496915 4797 scope.go:117] "RemoveContainer" containerID="7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f"
Sep 30 19:19:57 crc kubenswrapper[4797]: E0930
19:19:57.497351 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f\": container with ID starting with 7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f not found: ID does not exist" containerID="7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.497385 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f"} err="failed to get container status \"7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f\": rpc error: code = NotFound desc = could not find container \"7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f\": container with ID starting with 7d53f1177c9e30fc1b3373079827d49b2c3de7eb11bfa61fc809310b33fe3e3f not found: ID does not exist"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.711663 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2wj9r_9d1c6f31-57b5-4629-b6aa-abcf3394a4f4/cert-manager-controller/0.log"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.903858 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qj8wf_696a605e-f3af-4acb-941d-22aa927ba890/cert-manager-cainjector/0.log"
Sep 30 19:19:57 crc kubenswrapper[4797]: I0930 19:19:57.927253 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-77l4k_ddd09658-492a-4954-8d85-7c05dbe4b5c4/cert-manager-webhook/0.log"
Sep 30 19:19:58 crc kubenswrapper[4797]: I0930 19:19:58.257618 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" path="/var/lib/kubelet/pods/a6551fe3-4cee-489f-9afc-2f884cb9cf00/volumes"
Sep 30
19:20:09 crc kubenswrapper[4797]: I0930 19:20:09.237868 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0"
Sep 30 19:20:09 crc kubenswrapper[4797]: E0930 19:20:09.238539 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.034146 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-g9n7q_aeadbff5-abfe-4f8c-a24f-e62db0f23612/nmstate-console-plugin/0.log"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.258456 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x2tc2_ccc6478c-07c2-431f-a964-1db62dd3800e/nmstate-handler/0.log"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.264635 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-4nbj6_5e6982a1-49c0-428e-8b68-38899f1be907/kube-rbac-proxy/0.log"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.399610 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-4nbj6_5e6982a1-49c0-428e-8b68-38899f1be907/nmstate-metrics/0.log"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.454572 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-d2f5j_4ce57370-6b4e-4a2a-be84-6cea546156ac/nmstate-operator/0.log"
Sep 30 19:20:10 crc kubenswrapper[4797]: I0930 19:20:10.572238 4797 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-bxqvs_3417981c-8b7a-48f5-b504-c3a358706f7f/nmstate-webhook/0.log" Sep 30 19:20:23 crc kubenswrapper[4797]: I0930 19:20:23.238159 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:20:23 crc kubenswrapper[4797]: E0930 19:20:23.239022 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.253253 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-kr68l_9e16cc99-60f9-4551-a18e-17f9beeca400/kube-rbac-proxy/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.409507 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-kr68l_9e16cc99-60f9-4551-a18e-17f9beeca400/controller/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.470778 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.646664 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.690601 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.690620 4797 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.710318 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.873190 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.873509 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.879402 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:20:24 crc kubenswrapper[4797]: I0930 19:20:24.910918 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.025372 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.067849 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.073950 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.100335 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/controller/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.247389 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/frr-metrics/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.286374 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/kube-rbac-proxy/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.293814 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/kube-rbac-proxy-frr/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.491619 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/reloader/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.527954 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-667c9_f491bbe4-c848-4384-a932-13d5242e5871/frr-k8s-webhook-server/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.782863 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cb9fbcf6-fj2qz_10a1e68f-3d10-4da9-82db-d1043c94bcd8/manager/0.log" Sep 30 19:20:25 crc kubenswrapper[4797]: I0930 19:20:25.924519 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f4f79dfb8-6s6zz_77965127-1121-47ad-96b5-34229a106e24/webhook-server/0.log" Sep 30 19:20:26 crc kubenswrapper[4797]: I0930 19:20:26.085423 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dsmgz_0acb5984-08fc-4f2f-95d1-e65ba209a2f6/kube-rbac-proxy/0.log" Sep 30 19:20:26 crc kubenswrapper[4797]: I0930 19:20:26.658951 4797 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dsmgz_0acb5984-08fc-4f2f-95d1-e65ba209a2f6/speaker/0.log" Sep 30 19:20:26 crc kubenswrapper[4797]: I0930 19:20:26.937450 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/frr/0.log" Sep 30 19:20:38 crc kubenswrapper[4797]: I0930 19:20:38.238048 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:20:38 crc kubenswrapper[4797]: E0930 19:20:38.239063 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.221422 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.413372 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.419808 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.450801 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.609347 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.614266 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/extract/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.619013 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:20:40 crc kubenswrapper[4797]: I0930 19:20:40.787035 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log" Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.018449 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log" Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.033697 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log" Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.038133 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log" Sep 30 
19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.164154 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.207298 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.285595 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/extract/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.364563 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.551109 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.554408 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.567954 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log"
Sep 30 19:20:41 crc kubenswrapper[4797]: I0930 19:20:41.756167 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log"
Sep 30 19:20:41 crc
kubenswrapper[4797]: I0930 19:20:41.878917 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.042694 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.267066 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.269591 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.278299 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.342598 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/registry-server/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.476605 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.552386 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log"
Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.691898 4797 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.851242 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.947682 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:20:42 crc kubenswrapper[4797]: I0930 19:20:42.999893 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.148818 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.183702 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.270808 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/extract/0.log" Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.427412 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/registry-server/0.log" Sep 30 19:20:43 crc 
kubenswrapper[4797]: I0930 19:20:43.438616 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7qh78_e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2/marketplace-operator/0.log"
Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.591712 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log"
Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.799394 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log"
Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.828028 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log"
Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.832763 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log"
Sep 30 19:20:43 crc kubenswrapper[4797]: I0930 19:20:43.964205 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log"
Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.006677 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log"
Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.012658 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log"
Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.187039 4797 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/registry-server/0.log" Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.291255 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.330102 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log" Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.376770 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.612074 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:20:44 crc kubenswrapper[4797]: I0930 19:20:44.663382 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log" Sep 30 19:20:45 crc kubenswrapper[4797]: I0930 19:20:45.385810 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/registry-server/0.log" Sep 30 19:20:52 crc kubenswrapper[4797]: I0930 19:20:52.239141 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:20:52 crc kubenswrapper[4797]: E0930 19:20:52.240691 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:20:57 crc kubenswrapper[4797]: I0930 19:20:57.711302 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-4rpbl_e3897355-dd17-4112-94fd-42c45c4cfa7f/prometheus-operator/0.log" Sep 30 19:20:57 crc kubenswrapper[4797]: I0930 19:20:57.887568 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh_d153021c-fc00-4c10-91dd-69c13423dd4d/prometheus-operator-admission-webhook/0.log" Sep 30 19:20:57 crc kubenswrapper[4797]: I0930 19:20:57.932886 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4_9ea30b0e-2f00-44a1-8e46-cc36ffc843a5/prometheus-operator-admission-webhook/0.log" Sep 30 19:20:58 crc kubenswrapper[4797]: I0930 19:20:58.088296 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-nvvj2_f3d6451a-ed07-4fc4-9ebe-a8c8d514999c/operator/0.log" Sep 30 19:20:58 crc kubenswrapper[4797]: I0930 19:20:58.138832 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-r4wwh_91658bbd-3b03-40ce-af08-985444e42376/perses-operator/0.log" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.172882 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:02 crc kubenswrapper[4797]: E0930 19:21:02.174022 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="registry-server" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.174041 4797 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="registry-server" Sep 30 19:21:02 crc kubenswrapper[4797]: E0930 19:21:02.174060 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="extract-utilities" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.174067 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="extract-utilities" Sep 30 19:21:02 crc kubenswrapper[4797]: E0930 19:21:02.174077 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="extract-content" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.174082 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="extract-content" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.174284 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6551fe3-4cee-489f-9afc-2f884cb9cf00" containerName="registry-server" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.175771 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.186980 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.214646 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.214800 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.214829 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2npdg\" (UniqueName: \"kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.316846 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.317409 4797 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.317523 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2npdg\" (UniqueName: \"kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.318338 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.318710 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.338643 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2npdg\" (UniqueName: \"kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg\") pod \"community-operators-b9jvf\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:02 crc kubenswrapper[4797]: I0930 19:21:02.506158 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:03 crc kubenswrapper[4797]: I0930 19:21:03.094841 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:03 crc kubenswrapper[4797]: I0930 19:21:03.238170 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:21:03 crc kubenswrapper[4797]: E0930 19:21:03.238802 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:21:04 crc kubenswrapper[4797]: I0930 19:21:04.092826 4797 generic.go:334] "Generic (PLEG): container finished" podID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerID="f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07" exitCode=0 Sep 30 19:21:04 crc kubenswrapper[4797]: I0930 19:21:04.092880 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerDied","Data":"f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07"} Sep 30 19:21:04 crc kubenswrapper[4797]: I0930 19:21:04.093134 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerStarted","Data":"aa1496b578a6a7a02c182f1aa0f58e6c2d1b81854e60168e76173865cf4e13be"} Sep 30 19:21:04 crc kubenswrapper[4797]: I0930 19:21:04.094982 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 
19:21:05 crc kubenswrapper[4797]: I0930 19:21:05.103913 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerStarted","Data":"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416"} Sep 30 19:21:06 crc kubenswrapper[4797]: I0930 19:21:06.116044 4797 generic.go:334] "Generic (PLEG): container finished" podID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerID="74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416" exitCode=0 Sep 30 19:21:06 crc kubenswrapper[4797]: I0930 19:21:06.116178 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerDied","Data":"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416"} Sep 30 19:21:07 crc kubenswrapper[4797]: I0930 19:21:07.127121 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerStarted","Data":"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa"} Sep 30 19:21:07 crc kubenswrapper[4797]: I0930 19:21:07.148808 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9jvf" podStartSLOduration=2.670645186 podStartE2EDuration="5.148791321s" podCreationTimestamp="2025-09-30 19:21:02 +0000 UTC" firstStartedPulling="2025-09-30 19:21:04.094753853 +0000 UTC m=+5914.617253091" lastFinishedPulling="2025-09-30 19:21:06.572899988 +0000 UTC m=+5917.095399226" observedRunningTime="2025-09-30 19:21:07.1479898 +0000 UTC m=+5917.670489038" watchObservedRunningTime="2025-09-30 19:21:07.148791321 +0000 UTC m=+5917.671290559" Sep 30 19:21:12 crc kubenswrapper[4797]: I0930 19:21:12.506543 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:12 crc kubenswrapper[4797]: I0930 19:21:12.508175 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:13 crc kubenswrapper[4797]: I0930 19:21:13.556486 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b9jvf" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="registry-server" probeResult="failure" output=< Sep 30 19:21:13 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 19:21:13 crc kubenswrapper[4797]: > Sep 30 19:21:16 crc kubenswrapper[4797]: I0930 19:21:16.239023 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:21:16 crc kubenswrapper[4797]: E0930 19:21:16.239883 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:21:22 crc kubenswrapper[4797]: I0930 19:21:22.591498 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:22 crc kubenswrapper[4797]: I0930 19:21:22.675375 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:22 crc kubenswrapper[4797]: I0930 19:21:22.849277 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.317758 4797 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-b9jvf" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="registry-server" containerID="cri-o://34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa" gracePeriod=2 Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.846963 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.969065 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content\") pod \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.969563 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities\") pod \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.969942 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2npdg\" (UniqueName: \"kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg\") pod \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\" (UID: \"6f69b0dd-d96e-40b1-b5cc-085b4b55952f\") " Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.971065 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities" (OuterVolumeSpecName: "utilities") pod "6f69b0dd-d96e-40b1-b5cc-085b4b55952f" (UID: "6f69b0dd-d96e-40b1-b5cc-085b4b55952f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:21:24 crc kubenswrapper[4797]: I0930 19:21:24.974921 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg" (OuterVolumeSpecName: "kube-api-access-2npdg") pod "6f69b0dd-d96e-40b1-b5cc-085b4b55952f" (UID: "6f69b0dd-d96e-40b1-b5cc-085b4b55952f"). InnerVolumeSpecName "kube-api-access-2npdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.042038 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f69b0dd-d96e-40b1-b5cc-085b4b55952f" (UID: "6f69b0dd-d96e-40b1-b5cc-085b4b55952f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.073173 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2npdg\" (UniqueName: \"kubernetes.io/projected/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-kube-api-access-2npdg\") on node \"crc\" DevicePath \"\"" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.073210 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.073219 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f69b0dd-d96e-40b1-b5cc-085b4b55952f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.329632 4797 generic.go:334] "Generic (PLEG): container finished" podID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" 
containerID="34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa" exitCode=0 Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.329678 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerDied","Data":"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa"} Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.329711 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9jvf" event={"ID":"6f69b0dd-d96e-40b1-b5cc-085b4b55952f","Type":"ContainerDied","Data":"aa1496b578a6a7a02c182f1aa0f58e6c2d1b81854e60168e76173865cf4e13be"} Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.329728 4797 scope.go:117] "RemoveContainer" containerID="34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.329734 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9jvf" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.350668 4797 scope.go:117] "RemoveContainer" containerID="74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.372983 4797 scope.go:117] "RemoveContainer" containerID="f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.381937 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.391748 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9jvf"] Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.427065 4797 scope.go:117] "RemoveContainer" containerID="34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa" Sep 30 19:21:25 crc kubenswrapper[4797]: E0930 19:21:25.427654 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa\": container with ID starting with 34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa not found: ID does not exist" containerID="34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.427701 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa"} err="failed to get container status \"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa\": rpc error: code = NotFound desc = could not find container \"34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa\": container with ID starting with 34def6b5640f373992058258bf13e6ea96eab01b79f38926a5c81953ba367afa not 
found: ID does not exist" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.427728 4797 scope.go:117] "RemoveContainer" containerID="74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416" Sep 30 19:21:25 crc kubenswrapper[4797]: E0930 19:21:25.428107 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416\": container with ID starting with 74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416 not found: ID does not exist" containerID="74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.428280 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416"} err="failed to get container status \"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416\": rpc error: code = NotFound desc = could not find container \"74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416\": container with ID starting with 74524bf30906105b2e0d9272b65c6c426d0c206bf04815ec2c8d1ce44fe6b416 not found: ID does not exist" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.428358 4797 scope.go:117] "RemoveContainer" containerID="f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07" Sep 30 19:21:25 crc kubenswrapper[4797]: E0930 19:21:25.428708 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07\": container with ID starting with f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07 not found: ID does not exist" containerID="f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07" Sep 30 19:21:25 crc kubenswrapper[4797]: I0930 19:21:25.428735 4797 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07"} err="failed to get container status \"f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07\": rpc error: code = NotFound desc = could not find container \"f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07\": container with ID starting with f059686fc4bea8ba0d490a6dac1f5ec3e89fceefd183daa53a7ad93635003e07 not found: ID does not exist" Sep 30 19:21:26 crc kubenswrapper[4797]: I0930 19:21:26.258145 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" path="/var/lib/kubelet/pods/6f69b0dd-d96e-40b1-b5cc-085b4b55952f/volumes" Sep 30 19:21:28 crc kubenswrapper[4797]: I0930 19:21:28.238353 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:21:28 crc kubenswrapper[4797]: E0930 19:21:28.240129 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:21:41 crc kubenswrapper[4797]: I0930 19:21:41.239702 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:21:41 crc kubenswrapper[4797]: E0930 19:21:41.240748 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:21:53 crc kubenswrapper[4797]: I0930 19:21:53.238768 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:21:53 crc kubenswrapper[4797]: E0930 19:21:53.239733 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:22:07 crc kubenswrapper[4797]: I0930 19:22:07.237914 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:22:07 crc kubenswrapper[4797]: E0930 19:22:07.238600 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:22:22 crc kubenswrapper[4797]: I0930 19:22:22.239183 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:22:23 crc kubenswrapper[4797]: I0930 19:22:23.001026 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6"} Sep 30 19:23:13 crc kubenswrapper[4797]: I0930 19:23:13.608722 4797 generic.go:334] "Generic (PLEG): container finished" podID="33788a93-3dcc-4003-be30-4fe40760f228" containerID="32b17e9e780f59b5b8cdbad93a293629412d4ba542ff374df6d9161a65b24db7" exitCode=0 Sep 30 19:23:13 crc kubenswrapper[4797]: I0930 19:23:13.608806 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" event={"ID":"33788a93-3dcc-4003-be30-4fe40760f228","Type":"ContainerDied","Data":"32b17e9e780f59b5b8cdbad93a293629412d4ba542ff374df6d9161a65b24db7"} Sep 30 19:23:13 crc kubenswrapper[4797]: I0930 19:23:13.610000 4797 scope.go:117] "RemoveContainer" containerID="32b17e9e780f59b5b8cdbad93a293629412d4ba542ff374df6d9161a65b24db7" Sep 30 19:23:14 crc kubenswrapper[4797]: I0930 19:23:14.007324 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7qqgx_must-gather-hzj2c_33788a93-3dcc-4003-be30-4fe40760f228/gather/0.log" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.209366 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7qqgx/must-gather-hzj2c"] Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.210083 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="copy" containerID="cri-o://6afe6e3f75c02423708ccf1a0c1e44197abb51d475b6b583ab3cc71792a6e1b6" gracePeriod=2 Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.219666 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7qqgx/must-gather-hzj2c"] Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.743517 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-7qqgx_must-gather-hzj2c_33788a93-3dcc-4003-be30-4fe40760f228/copy/0.log" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.746850 4797 generic.go:334] "Generic (PLEG): container finished" podID="33788a93-3dcc-4003-be30-4fe40760f228" containerID="6afe6e3f75c02423708ccf1a0c1e44197abb51d475b6b583ab3cc71792a6e1b6" exitCode=143 Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.746896 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad0a8ef3b147c43d207e29372f82a6761db6627160a5c9b30da1706c776eae9" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.747873 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7qqgx_must-gather-hzj2c_33788a93-3dcc-4003-be30-4fe40760f228/copy/0.log" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.748113 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.878350 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output\") pod \"33788a93-3dcc-4003-be30-4fe40760f228\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.878647 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8tw2\" (UniqueName: \"kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2\") pod \"33788a93-3dcc-4003-be30-4fe40760f228\" (UID: \"33788a93-3dcc-4003-be30-4fe40760f228\") " Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.899153 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2" (OuterVolumeSpecName: 
"kube-api-access-f8tw2") pod "33788a93-3dcc-4003-be30-4fe40760f228" (UID: "33788a93-3dcc-4003-be30-4fe40760f228"). InnerVolumeSpecName "kube-api-access-f8tw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:23:23 crc kubenswrapper[4797]: I0930 19:23:23.980968 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8tw2\" (UniqueName: \"kubernetes.io/projected/33788a93-3dcc-4003-be30-4fe40760f228-kube-api-access-f8tw2\") on node \"crc\" DevicePath \"\"" Sep 30 19:23:24 crc kubenswrapper[4797]: I0930 19:23:24.064961 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "33788a93-3dcc-4003-be30-4fe40760f228" (UID: "33788a93-3dcc-4003-be30-4fe40760f228"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:23:24 crc kubenswrapper[4797]: I0930 19:23:24.083102 4797 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33788a93-3dcc-4003-be30-4fe40760f228-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 19:23:24 crc kubenswrapper[4797]: I0930 19:23:24.248925 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33788a93-3dcc-4003-be30-4fe40760f228" path="/var/lib/kubelet/pods/33788a93-3dcc-4003-be30-4fe40760f228/volumes" Sep 30 19:23:24 crc kubenswrapper[4797]: I0930 19:23:24.758307 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7qqgx/must-gather-hzj2c" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.686967 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5hqg/must-gather-z28w4"] Sep 30 19:24:08 crc kubenswrapper[4797]: E0930 19:24:08.688506 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="extract-content" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688528 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="extract-content" Sep 30 19:24:08 crc kubenswrapper[4797]: E0930 19:24:08.688545 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="gather" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688553 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="gather" Sep 30 19:24:08 crc kubenswrapper[4797]: E0930 19:24:08.688587 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="copy" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688594 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="copy" Sep 30 19:24:08 crc kubenswrapper[4797]: E0930 19:24:08.688630 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="extract-utilities" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688640 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="extract-utilities" Sep 30 19:24:08 crc kubenswrapper[4797]: E0930 19:24:08.688673 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" 
containerName="registry-server" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688678 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="registry-server" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688921 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="copy" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688944 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f69b0dd-d96e-40b1-b5cc-085b4b55952f" containerName="registry-server" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.688969 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="33788a93-3dcc-4003-be30-4fe40760f228" containerName="gather" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.691023 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.695021 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c5hqg"/"kube-root-ca.crt" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.703327 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c5hqg"/"openshift-service-ca.crt" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.715070 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c5hqg/must-gather-z28w4"] Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.874444 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: 
I0930 19:24:08.874552 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbn8\" (UniqueName: \"kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.976003 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbn8\" (UniqueName: \"kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.976143 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.976564 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:08 crc kubenswrapper[4797]: I0930 19:24:08.998550 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbn8\" (UniqueName: \"kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8\") pod \"must-gather-z28w4\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:09 crc kubenswrapper[4797]: I0930 
19:24:09.025720 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:24:09 crc kubenswrapper[4797]: I0930 19:24:09.584628 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c5hqg/must-gather-z28w4"] Sep 30 19:24:10 crc kubenswrapper[4797]: I0930 19:24:10.290798 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/must-gather-z28w4" event={"ID":"b796179f-ac01-4663-b13f-ba4506fa30ec","Type":"ContainerStarted","Data":"4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c"} Sep 30 19:24:10 crc kubenswrapper[4797]: I0930 19:24:10.291935 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/must-gather-z28w4" event={"ID":"b796179f-ac01-4663-b13f-ba4506fa30ec","Type":"ContainerStarted","Data":"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db"} Sep 30 19:24:10 crc kubenswrapper[4797]: I0930 19:24:10.292010 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/must-gather-z28w4" event={"ID":"b796179f-ac01-4663-b13f-ba4506fa30ec","Type":"ContainerStarted","Data":"a9d84f9d13db84d578d65e03a78331aaa61ea41c189c919354eadf4380f081b4"} Sep 30 19:24:10 crc kubenswrapper[4797]: I0930 19:24:10.312328 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c5hqg/must-gather-z28w4" podStartSLOduration=2.3123055040000002 podStartE2EDuration="2.312305504s" podCreationTimestamp="2025-09-30 19:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:24:10.306123354 +0000 UTC m=+6100.828622602" watchObservedRunningTime="2025-09-30 19:24:10.312305504 +0000 UTC m=+6100.834804732" Sep 30 19:24:12 crc kubenswrapper[4797]: E0930 19:24:12.505541 4797 upgradeaware.go:427] Error proxying data from client to backend: 
readfrom tcp 38.102.83.119:37634->38.102.83.119:33393: write tcp 38.102.83.119:37634->38.102.83.119:33393: write: broken pipe Sep 30 19:24:12 crc kubenswrapper[4797]: E0930 19:24:12.804877 4797 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:37688->38.102.83.119:33393: write tcp 38.102.83.119:37688->38.102.83.119:33393: write: broken pipe Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.619348 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-zdp6x"] Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.621393 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.623668 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c5hqg"/"default-dockercfg-hv5jv" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.676553 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck27v\" (UniqueName: \"kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v\") pod \"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.676909 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host\") pod \"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.779006 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck27v\" (UniqueName: \"kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v\") pod 
\"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.779106 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host\") pod \"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.779359 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host\") pod \"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.797795 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck27v\" (UniqueName: \"kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v\") pod \"crc-debug-zdp6x\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: I0930 19:24:13.942070 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:24:13 crc kubenswrapper[4797]: W0930 19:24:13.983922 4797 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf1a253f_8d96_4437_953b_0c4285227c36.slice/crio-bd4e913100c2432817568d626500a16c33052b60064319b9afb32723dbdac37b WatchSource:0}: Error finding container bd4e913100c2432817568d626500a16c33052b60064319b9afb32723dbdac37b: Status 404 returned error can't find the container with id bd4e913100c2432817568d626500a16c33052b60064319b9afb32723dbdac37b Sep 30 19:24:14 crc kubenswrapper[4797]: I0930 19:24:14.340545 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" event={"ID":"df1a253f-8d96-4437-953b-0c4285227c36","Type":"ContainerStarted","Data":"4061ebdca6566b378517452e2a94ba2954e63cf30ed7665a63605af65acb90d0"} Sep 30 19:24:14 crc kubenswrapper[4797]: I0930 19:24:14.341373 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" event={"ID":"df1a253f-8d96-4437-953b-0c4285227c36","Type":"ContainerStarted","Data":"bd4e913100c2432817568d626500a16c33052b60064319b9afb32723dbdac37b"} Sep 30 19:24:14 crc kubenswrapper[4797]: I0930 19:24:14.360699 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" podStartSLOduration=1.36067477 podStartE2EDuration="1.36067477s" podCreationTimestamp="2025-09-30 19:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:24:14.355742966 +0000 UTC m=+6104.878242204" watchObservedRunningTime="2025-09-30 19:24:14.36067477 +0000 UTC m=+6104.883174018" Sep 30 19:24:18 crc kubenswrapper[4797]: I0930 19:24:18.665714 4797 scope.go:117] "RemoveContainer" 
containerID="32b17e9e780f59b5b8cdbad93a293629412d4ba542ff374df6d9161a65b24db7" Sep 30 19:24:18 crc kubenswrapper[4797]: I0930 19:24:18.824746 4797 scope.go:117] "RemoveContainer" containerID="6afe6e3f75c02423708ccf1a0c1e44197abb51d475b6b583ab3cc71792a6e1b6" Sep 30 19:24:44 crc kubenswrapper[4797]: I0930 19:24:44.191491 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:24:44 crc kubenswrapper[4797]: I0930 19:24:44.192209 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:25:14 crc kubenswrapper[4797]: I0930 19:25:14.191935 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:25:14 crc kubenswrapper[4797]: I0930 19:25:14.192740 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:25:18 crc kubenswrapper[4797]: I0930 19:25:18.890368 4797 scope.go:117] "RemoveContainer" containerID="c3831d27843532d14f3c2a620aee1d1d6e4b53d325f00d08ede2998740e2f31d" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.216417 4797 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c5546bcbd-84n4q_9365f350-1fad-4ab1-a694-49912e391383/barbican-api/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.291106 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c5546bcbd-84n4q_9365f350-1fad-4ab1-a694-49912e391383/barbican-api-log/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.470627 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5768c5b854-k959d_9c9236f9-becb-4d5c-aeb5-56a3b0547c86/barbican-keystone-listener/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.538164 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5768c5b854-k959d_9c9236f9-becb-4d5c-aeb5-56a3b0547c86/barbican-keystone-listener-log/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.661243 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b75d759d5-6bwm5_080f211c-e410-4f16-af62-78ce0d6d9d26/barbican-worker/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.737288 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b75d759d5-6bwm5_080f211c-e410-4f16-af62-78ce0d6d9d26/barbican-worker-log/0.log" Sep 30 19:25:35 crc kubenswrapper[4797]: I0930 19:25:35.897892 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v9fdg_76ed1105-fad0-4d4d-9039-06795b66a457/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.084297 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/ceilometer-central-agent/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.111662 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/ceilometer-notification-agent/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.179726 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/proxy-httpd/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.260293 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_239988d8-f0f2-49d2-95aa-2f50d3b1f5ce/sg-core/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.455801 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9f1efb9-4e3d-4371-bd43-55cffbe2d06d/cinder-api/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.469200 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b9f1efb9-4e3d-4371-bd43-55cffbe2d06d/cinder-api-log/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.663058 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6910128f-6ddf-4edf-86b6-a313f85db70d/probe/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.712384 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6910128f-6ddf-4edf-86b6-a313f85db70d/cinder-scheduler/0.log" Sep 30 19:25:36 crc kubenswrapper[4797]: I0930 19:25:36.859793 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-k9785_49ebc230-80f5-4bd0-a2fb-91cd9705a000/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.011065 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-g45vj_d44c9877-2212-4102-8f03-d2cf682cf7b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 
19:25:37.167088 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vn7ct_994f3ed9-cd04-48c1-a7ab-f0c3d08b5858/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.294715 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/init/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.417630 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/init/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.635313 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-sc66n_28db4edb-04c5-44de-917b-8578fa6c4031/dnsmasq-dns/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.680940 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-fzj8h_deaf91d6-f89f-4acb-a4a1-0cf56a3a46a7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.867208 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7484ca2f-31fc-4ede-bdcc-2ce25e4d5023/glance-httpd/0.log" Sep 30 19:25:37 crc kubenswrapper[4797]: I0930 19:25:37.905026 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7484ca2f-31fc-4ede-bdcc-2ce25e4d5023/glance-log/0.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.081625 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2bd684cf-1443-4068-8b8f-7b1961474c80/glance-httpd/0.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.089729 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_2bd684cf-1443-4068-8b8f-7b1961474c80/glance-log/0.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.438738 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon/1.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.459279 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon/0.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.699393 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65dn7_047af11f-ac90-41df-96c5-be75581aff10/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:38 crc kubenswrapper[4797]: I0930 19:25:38.867557 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-k5jg9_2ab411d2-db8f-47ff-9233-739acad6d3ee/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:39 crc kubenswrapper[4797]: I0930 19:25:39.229219 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320981-sk9mv_90f2ee8c-9aaa-4cbc-bde8-25bc8a297045/keystone-cron/0.log" Sep 30 19:25:39 crc kubenswrapper[4797]: I0930 19:25:39.414949 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6676d4ddcd-sxf6l_04e30fb7-7876-4a90-b887-05b7da2f7746/horizon-log/0.log" Sep 30 19:25:39 crc kubenswrapper[4797]: I0930 19:25:39.462417 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1084da2e-fefb-4741-89c4-90257f878bf8/kube-state-metrics/0.log" Sep 30 19:25:39 crc kubenswrapper[4797]: I0930 19:25:39.577690 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b748fb867-znqws_af775f0c-a3ef-4bd7-bf2e-cecdacda03ff/keystone-api/0.log" Sep 
30 19:25:39 crc kubenswrapper[4797]: I0930 19:25:39.705636 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vzb94_a008ddae-ddb5-47a3-9423-0da1ffdb8322/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:40 crc kubenswrapper[4797]: I0930 19:25:40.232465 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4d4d55c-fzms2_6726c7e0-d359-494e-9a9b-54c878d16e6b/neutron-httpd/0.log" Sep 30 19:25:40 crc kubenswrapper[4797]: I0930 19:25:40.265727 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4d4d55c-fzms2_6726c7e0-d359-494e-9a9b-54c878d16e6b/neutron-api/0.log" Sep 30 19:25:40 crc kubenswrapper[4797]: I0930 19:25:40.387701 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tx67q_93bccc4f-33ad-45b9-9549-20ba5484888f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:41 crc kubenswrapper[4797]: I0930 19:25:41.317185 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_627d236c-592a-46d3-9ef3-5adc1749c0c9/nova-cell0-conductor-conductor/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.004207 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e0b40b7a-54a6-4fb4-868d-85f26823aeb3/nova-api-log/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.072529 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_549d2ac6-4a20-4698-ad97-b4a94dab16e0/nova-cell1-conductor-conductor/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.484660 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e16d4537-8c99-431f-bcd6-da24200f085b/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.513425 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e0b40b7a-54a6-4fb4-868d-85f26823aeb3/nova-api-api/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.834959 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-92w66_cd2701f6-0eb3-4359-a16d-7435179896c0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:42 crc kubenswrapper[4797]: I0930 19:25:42.857177 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4225b925-9d11-4b4b-8e2d-1063584ef26c/nova-metadata-log/0.log" Sep 30 19:25:43 crc kubenswrapper[4797]: I0930 19:25:43.465024 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fc0b01e6-a07d-4c6a-9bf5-1bada38ee89b/nova-scheduler-scheduler/0.log" Sep 30 19:25:43 crc kubenswrapper[4797]: I0930 19:25:43.489426 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/mysql-bootstrap/0.log" Sep 30 19:25:43 crc kubenswrapper[4797]: I0930 19:25:43.669863 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/mysql-bootstrap/0.log" Sep 30 19:25:43 crc kubenswrapper[4797]: I0930 19:25:43.737545 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9b64d922-5a17-4831-b87e-78ae0a9a9042/galera/0.log" Sep 30 19:25:43 crc kubenswrapper[4797]: I0930 19:25:43.944671 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/mysql-bootstrap/0.log" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.165327 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/mysql-bootstrap/0.log" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.192074 4797 
patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.192367 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.192573 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.193702 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.193870 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6" gracePeriod=600 Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.255785 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c386be5e-6533-42b6-8a82-512c4c60cab2/galera/0.log" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.504567 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2a0d8448-cb6f-4fe4-9458-fad3bfd11471/openstackclient/0.log" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.712569 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8bw4c_828f5c5e-04c9-49c0-8056-7c930e756a44/openstack-network-exporter/0.log" Sep 30 19:25:44 crc kubenswrapper[4797]: I0930 19:25:44.872604 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nf4pk_4a527992-92f7-4aab-b8d4-e75ec72fd684/ovn-controller/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.197288 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server-init/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.246385 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6" exitCode=0 Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.246426 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6"} Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.246511 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerStarted","Data":"487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6"} Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.246532 4797 scope.go:117] "RemoveContainer" containerID="52f3f3cda16eb49ccc889c2c287e8bad3813b684cdeba4225d914c6dc5a67cc0" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.417822 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server-init/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.427723 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovs-vswitchd/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.530115 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4225b925-9d11-4b4b-8e2d-1063584ef26c/nova-metadata-metadata/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.601066 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dqkv4_2671b936-5121-4120-b39c-9686d92ed101/ovsdb-server/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.766874 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dmdz9_3d61414a-adba-4fcd-b3ca-417935b2c4db/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.852935 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3139ddf0-f590-40e2-bd15-0af615d5cbf1/openstack-network-exporter/0.log" Sep 30 19:25:45 crc kubenswrapper[4797]: I0930 19:25:45.972672 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3139ddf0-f590-40e2-bd15-0af615d5cbf1/ovn-northd/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.100038 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df911ba-9b38-46e5-b779-3db695c839a9/openstack-network-exporter/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.214599 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df911ba-9b38-46e5-b779-3db695c839a9/ovsdbserver-nb/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.347642 
4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3e42915-c1cf-479d-8cb1-d337a4407d64/openstack-network-exporter/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.444915 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3e42915-c1cf-479d-8cb1-d337a4407d64/ovsdbserver-sb/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.714030 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c47f4984-nxfz7_cd02f0fa-36e8-4676-802c-37127e022ad0/placement-api/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.888780 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c47f4984-nxfz7_cd02f0fa-36e8-4676-802c-37127e022ad0/placement-log/0.log" Sep 30 19:25:46 crc kubenswrapper[4797]: I0930 19:25:46.965200 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/init-config-reloader/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.144702 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/config-reloader/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.179989 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/prometheus/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.204406 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/init-config-reloader/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.355868 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3e430836-74b1-48cb-84bc-f623d27d6c93/thanos-sidecar/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 
19:25:47.426445 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/setup-container/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.656233 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/setup-container/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.670197 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_399a6fbd-ad4b-4ff2-bb24-a2d1250c8c0d/rabbitmq/0.log" Sep 30 19:25:47 crc kubenswrapper[4797]: I0930 19:25:47.905777 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/setup-container/0.log" Sep 30 19:25:48 crc kubenswrapper[4797]: I0930 19:25:48.103396 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/rabbitmq/0.log" Sep 30 19:25:48 crc kubenswrapper[4797]: I0930 19:25:48.187618 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7ffa7d5-6c5e-4d12-beb4-beca118f83d5/setup-container/0.log" Sep 30 19:25:48 crc kubenswrapper[4797]: I0930 19:25:48.592564 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-4tsdb_1c7d707f-71bd-4194-b7c2-14a592f9772c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:48 crc kubenswrapper[4797]: I0930 19:25:48.626480 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-j5hr2_d8595051-9106-4b9a-bc5a-0a3e2e6ad11f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:48 crc kubenswrapper[4797]: I0930 19:25:48.806673 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jt9bh_bff66612-d0f8-4159-a096-478975f4d2e5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.010146 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-grthd_9b428893-75cc-423a-8ce8-31ccc2068037/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.178954 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rwq96_1976056c-3312-40a7-b5ae-f287c229e0a3/ssh-known-hosts-edpm-deployment/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.417647 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-594ff6944c-p2jp5_89ca411e-ead4-4a2d-9eba-f3f8fffcad46/proxy-server/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.569650 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-594ff6944c-p2jp5_89ca411e-ead4-4a2d-9eba-f3f8fffcad46/proxy-httpd/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.613178 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cx5sg_139c3278-4f30-418b-ae01-2ea9ac63ab55/swift-ring-rebalance/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.750342 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-auditor/0.log" Sep 30 19:25:49 crc kubenswrapper[4797]: I0930 19:25:49.795747 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-reaper/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.000501 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-replicator/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.045507 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/account-server/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.097675 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-auditor/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.236068 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-replicator/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.296199 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-server/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.297982 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/container-updater/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.461079 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-expirer/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.505909 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-auditor/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.588275 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-replicator/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.699055 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-server/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.705527 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/object-updater/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.822172 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/rsync/0.log" Sep 30 19:25:50 crc kubenswrapper[4797]: I0930 19:25:50.964154 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4fecceec-298d-4979-b468-5fe35c9b68e7/swift-recon-cron/0.log" Sep 30 19:25:51 crc kubenswrapper[4797]: I0930 19:25:51.165578 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9wtwm_287e8ba5-a33d-49ac-bd3f-b85dd5a401a3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:51 crc kubenswrapper[4797]: I0930 19:25:51.311004 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_05275916-b3be-4d53-8a06-ab3d5c8b3f7b/tempest-tests-tempest-tests-runner/0.log" Sep 30 19:25:51 crc kubenswrapper[4797]: I0930 19:25:51.499797 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2754f75a-653d-4ec9-8bac-e81f0353ec88/test-operator-logs-container/0.log" Sep 30 19:25:51 crc kubenswrapper[4797]: I0930 19:25:51.672736 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-fkswn_14617993-cbb9-43c8-9ec5-d3a3afb1bc19/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 19:25:52 crc kubenswrapper[4797]: I0930 19:25:52.874845 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-applier-0_a04de7be-0f64-475b-8f90-5fb466645c02/watcher-applier/0.log" Sep 30 19:25:52 crc kubenswrapper[4797]: I0930 19:25:52.985520 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_84edad16-e218-42bd-bba3-77d16184436c/watcher-api-log/0.log" Sep 30 19:25:54 crc kubenswrapper[4797]: I0930 19:25:54.799030 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_103ee950-a749-41ce-be1e-bdfb715bc7ad/watcher-decision-engine/0.log" Sep 30 19:25:57 crc kubenswrapper[4797]: I0930 19:25:57.324365 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_84edad16-e218-42bd-bba3-77d16184436c/watcher-api/0.log" Sep 30 19:25:58 crc kubenswrapper[4797]: I0930 19:25:58.482265 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6e728b8d-50b8-43df-bb8c-e3cbfce614e9/memcached/0.log" Sep 30 19:26:25 crc kubenswrapper[4797]: I0930 19:26:25.730290 4797 generic.go:334] "Generic (PLEG): container finished" podID="df1a253f-8d96-4437-953b-0c4285227c36" containerID="4061ebdca6566b378517452e2a94ba2954e63cf30ed7665a63605af65acb90d0" exitCode=0 Sep 30 19:26:25 crc kubenswrapper[4797]: I0930 19:26:25.731273 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" event={"ID":"df1a253f-8d96-4437-953b-0c4285227c36","Type":"ContainerDied","Data":"4061ebdca6566b378517452e2a94ba2954e63cf30ed7665a63605af65acb90d0"} Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.853477 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.901021 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-zdp6x"] Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.910865 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-zdp6x"] Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.922990 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck27v\" (UniqueName: \"kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v\") pod \"df1a253f-8d96-4437-953b-0c4285227c36\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.923444 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host\") pod \"df1a253f-8d96-4437-953b-0c4285227c36\" (UID: \"df1a253f-8d96-4437-953b-0c4285227c36\") " Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.924088 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host" (OuterVolumeSpecName: "host") pod "df1a253f-8d96-4437-953b-0c4285227c36" (UID: "df1a253f-8d96-4437-953b-0c4285227c36"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:26:26 crc kubenswrapper[4797]: I0930 19:26:26.929724 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v" (OuterVolumeSpecName: "kube-api-access-ck27v") pod "df1a253f-8d96-4437-953b-0c4285227c36" (UID: "df1a253f-8d96-4437-953b-0c4285227c36"). InnerVolumeSpecName "kube-api-access-ck27v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:26:27 crc kubenswrapper[4797]: I0930 19:26:27.025946 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df1a253f-8d96-4437-953b-0c4285227c36-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:27 crc kubenswrapper[4797]: I0930 19:26:27.025993 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck27v\" (UniqueName: \"kubernetes.io/projected/df1a253f-8d96-4437-953b-0c4285227c36-kube-api-access-ck27v\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:27 crc kubenswrapper[4797]: I0930 19:26:27.760008 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4e913100c2432817568d626500a16c33052b60064319b9afb32723dbdac37b" Sep 30 19:26:27 crc kubenswrapper[4797]: I0930 19:26:27.760471 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-zdp6x" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.139402 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-9md5t"] Sep 30 19:26:28 crc kubenswrapper[4797]: E0930 19:26:28.140151 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1a253f-8d96-4437-953b-0c4285227c36" containerName="container-00" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.140168 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1a253f-8d96-4437-953b-0c4285227c36" containerName="container-00" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.140476 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1a253f-8d96-4437-953b-0c4285227c36" containerName="container-00" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.141538 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.145579 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c5hqg"/"default-dockercfg-hv5jv" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.252546 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df1a253f-8d96-4437-953b-0c4285227c36" path="/var/lib/kubelet/pods/df1a253f-8d96-4437-953b-0c4285227c36/volumes" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.253184 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgsbp\" (UniqueName: \"kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp\") pod \"crc-debug-9md5t\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.253356 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host\") pod \"crc-debug-9md5t\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.355715 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgsbp\" (UniqueName: \"kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp\") pod \"crc-debug-9md5t\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.355801 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host\") pod \"crc-debug-9md5t\" (UID: 
\"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.356525 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host\") pod \"crc-debug-9md5t\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.379460 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgsbp\" (UniqueName: \"kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp\") pod \"crc-debug-9md5t\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.468667 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:28 crc kubenswrapper[4797]: I0930 19:26:28.774024 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" event={"ID":"23fe316c-0e2e-4ef4-99e9-17684cfa873f","Type":"ContainerStarted","Data":"4a8b9ee92840d4d1c43ddecd4d171c7659fdae9d98ce4782d3d3373e842a1cb5"} Sep 30 19:26:29 crc kubenswrapper[4797]: I0930 19:26:29.785116 4797 generic.go:334] "Generic (PLEG): container finished" podID="23fe316c-0e2e-4ef4-99e9-17684cfa873f" containerID="4007f7b3c42427088db16135dbc13fd53f63881c166957191cd34738eea5fee4" exitCode=0 Sep 30 19:26:29 crc kubenswrapper[4797]: I0930 19:26:29.785202 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" event={"ID":"23fe316c-0e2e-4ef4-99e9-17684cfa873f","Type":"ContainerDied","Data":"4007f7b3c42427088db16135dbc13fd53f63881c166957191cd34738eea5fee4"} Sep 30 19:26:30 crc kubenswrapper[4797]: I0930 19:26:30.898655 4797 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.017235 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgsbp\" (UniqueName: \"kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp\") pod \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.017723 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host\") pod \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\" (UID: \"23fe316c-0e2e-4ef4-99e9-17684cfa873f\") " Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.017927 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host" (OuterVolumeSpecName: "host") pod "23fe316c-0e2e-4ef4-99e9-17684cfa873f" (UID: "23fe316c-0e2e-4ef4-99e9-17684cfa873f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.018185 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23fe316c-0e2e-4ef4-99e9-17684cfa873f-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.023151 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp" (OuterVolumeSpecName: "kube-api-access-tgsbp") pod "23fe316c-0e2e-4ef4-99e9-17684cfa873f" (UID: "23fe316c-0e2e-4ef4-99e9-17684cfa873f"). InnerVolumeSpecName "kube-api-access-tgsbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.119376 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgsbp\" (UniqueName: \"kubernetes.io/projected/23fe316c-0e2e-4ef4-99e9-17684cfa873f-kube-api-access-tgsbp\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.804393 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" event={"ID":"23fe316c-0e2e-4ef4-99e9-17684cfa873f","Type":"ContainerDied","Data":"4a8b9ee92840d4d1c43ddecd4d171c7659fdae9d98ce4782d3d3373e842a1cb5"} Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.804471 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8b9ee92840d4d1c43ddecd4d171c7659fdae9d98ce4782d3d3373e842a1cb5" Sep 30 19:26:31 crc kubenswrapper[4797]: I0930 19:26:31.804527 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-9md5t" Sep 30 19:26:40 crc kubenswrapper[4797]: I0930 19:26:40.219707 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-9md5t"] Sep 30 19:26:40 crc kubenswrapper[4797]: I0930 19:26:40.228724 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-9md5t"] Sep 30 19:26:40 crc kubenswrapper[4797]: I0930 19:26:40.250979 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fe316c-0e2e-4ef4-99e9-17684cfa873f" path="/var/lib/kubelet/pods/23fe316c-0e2e-4ef4-99e9-17684cfa873f/volumes" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.413795 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-c2rx5"] Sep 30 19:26:41 crc kubenswrapper[4797]: E0930 19:26:41.414476 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fe316c-0e2e-4ef4-99e9-17684cfa873f" 
containerName="container-00" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.414488 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fe316c-0e2e-4ef4-99e9-17684cfa873f" containerName="container-00" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.414717 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fe316c-0e2e-4ef4-99e9-17684cfa873f" containerName="container-00" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.415368 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.417893 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c5hqg"/"default-dockercfg-hv5jv" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.514588 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdfb\" (UniqueName: \"kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.514688 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.617679 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdfb\" (UniqueName: \"kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 
crc kubenswrapper[4797]: I0930 19:26:41.617781 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.617952 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.661462 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsdfb\" (UniqueName: \"kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb\") pod \"crc-debug-c2rx5\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.746148 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:41 crc kubenswrapper[4797]: I0930 19:26:41.891814 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" event={"ID":"2f83f53b-c063-4c9e-a37e-a36f4540b859","Type":"ContainerStarted","Data":"fdd0ce7ca62e3ac1816ae8b66a6bec8cf6b39014b5af6a2e3f139139a4b3eb29"} Sep 30 19:26:42 crc kubenswrapper[4797]: I0930 19:26:42.904780 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" event={"ID":"2f83f53b-c063-4c9e-a37e-a36f4540b859","Type":"ContainerDied","Data":"c5ca62fb7efe9bf3490d5b6eca1b141af00b605dede9c8ea13f9d1164b8d2100"} Sep 30 19:26:42 crc kubenswrapper[4797]: I0930 19:26:42.904625 4797 generic.go:334] "Generic (PLEG): container finished" podID="2f83f53b-c063-4c9e-a37e-a36f4540b859" containerID="c5ca62fb7efe9bf3490d5b6eca1b141af00b605dede9c8ea13f9d1164b8d2100" exitCode=0 Sep 30 19:26:43 crc kubenswrapper[4797]: I0930 19:26:43.089225 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-c2rx5"] Sep 30 19:26:43 crc kubenswrapper[4797]: I0930 19:26:43.100725 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5hqg/crc-debug-c2rx5"] Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.054265 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.173903 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host\") pod \"2f83f53b-c063-4c9e-a37e-a36f4540b859\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.173970 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdfb\" (UniqueName: \"kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb\") pod \"2f83f53b-c063-4c9e-a37e-a36f4540b859\" (UID: \"2f83f53b-c063-4c9e-a37e-a36f4540b859\") " Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.174061 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host" (OuterVolumeSpecName: "host") pod "2f83f53b-c063-4c9e-a37e-a36f4540b859" (UID: "2f83f53b-c063-4c9e-a37e-a36f4540b859"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.174387 4797 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f83f53b-c063-4c9e-a37e-a36f4540b859-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.189693 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb" (OuterVolumeSpecName: "kube-api-access-tsdfb") pod "2f83f53b-c063-4c9e-a37e-a36f4540b859" (UID: "2f83f53b-c063-4c9e-a37e-a36f4540b859"). InnerVolumeSpecName "kube-api-access-tsdfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.248876 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f83f53b-c063-4c9e-a37e-a36f4540b859" path="/var/lib/kubelet/pods/2f83f53b-c063-4c9e-a37e-a36f4540b859/volumes" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.276183 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdfb\" (UniqueName: \"kubernetes.io/projected/2f83f53b-c063-4c9e-a37e-a36f4540b859-kube-api-access-tsdfb\") on node \"crc\" DevicePath \"\"" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.924573 4797 scope.go:117] "RemoveContainer" containerID="c5ca62fb7efe9bf3490d5b6eca1b141af00b605dede9c8ea13f9d1164b8d2100" Sep 30 19:26:44 crc kubenswrapper[4797]: I0930 19:26:44.924628 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/crc-debug-c2rx5" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.159955 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.339931 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.386638 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.394992 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.642013 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/util/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.642480 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/pull/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.685613 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_32696112cd108f810e92200d76b5367dd09d983c683f8a54ad13dc2a2asnwj6_7746f536-ef36-41bd-9f28-94c03952ffde/extract/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.859459 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-csf68_3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa/kube-rbac-proxy/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.914173 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-csf68_3f3ccfc4-f89f-43c2-9bbd-51bbfffa62fa/manager/0.log" Sep 30 19:26:45 crc kubenswrapper[4797]: I0930 19:26:45.950390 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-wvthj_481318fa-263c-4a4b-b775-879776670ddb/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.082592 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-wvthj_481318fa-263c-4a4b-b775-879776670ddb/manager/0.log" Sep 30 19:26:46 crc 
kubenswrapper[4797]: I0930 19:26:46.117651 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-85wtx_89b215ec-763f-4eb9-aef0-7f5b1d43481d/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.143481 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-85wtx_89b215ec-763f-4eb9-aef0-7f5b1d43481d/manager/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.314785 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-585mg_72a082c8-41b8-4666-bdd1-8f998dedc4c3/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.368596 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-585mg_72a082c8-41b8-4666-bdd1-8f998dedc4c3/manager/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.523688 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h2z55_c3c39950-97e6-423c-8884-b65548f38830/manager/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.538090 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-d2mm8_07a9edee-b2ec-48d8-85b3-191f2f29bf73/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.545666 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h2z55_c3c39950-97e6-423c-8884-b65548f38830/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.701324 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-9zcc2_1c601f06-9978-4f2b-8f37-2fa1bef8e8dd/kube-rbac-proxy/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.756533 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-d2mm8_07a9edee-b2ec-48d8-85b3-191f2f29bf73/manager/0.log" Sep 30 19:26:46 crc kubenswrapper[4797]: I0930 19:26:46.912447 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-9zcc2_1c601f06-9978-4f2b-8f37-2fa1bef8e8dd/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.111786 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4wpww_9f9e430f-f1af-46a5-9885-2e25473d376d/kube-rbac-proxy/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.135393 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4wpww_9f9e430f-f1af-46a5-9885-2e25473d376d/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.263577 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-slc6t_8824a3d0-28dc-42eb-b767-b9425f556076/kube-rbac-proxy/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.374187 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-slc6t_8824a3d0-28dc-42eb-b767-b9425f556076/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.493902 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vlc4g_24d8b2e3-5124-4bac-8cb1-871daabad7e6/kube-rbac-proxy/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.528940 
4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vlc4g_24d8b2e3-5124-4bac-8cb1-871daabad7e6/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.565279 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n564q_65d324d6-26a4-4a59-a29d-a92cad26a07a/kube-rbac-proxy/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.708085 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n564q_65d324d6-26a4-4a59-a29d-a92cad26a07a/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.747790 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-qlrv8_9508eece-17ea-4b43-9bdb-6c2f8da6e21f/kube-rbac-proxy/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.839600 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-qlrv8_9508eece-17ea-4b43-9bdb-6c2f8da6e21f/manager/0.log" Sep 30 19:26:47 crc kubenswrapper[4797]: I0930 19:26:47.930159 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-d7r9l_04f5a5c7-892a-4aa5-8e21-ec847d9e29fb/kube-rbac-proxy/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.042293 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-d7r9l_04f5a5c7-892a-4aa5-8e21-ec847d9e29fb/manager/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.154703 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-c58zh_3195fff1-53f5-491a-869b-0f7fc5e45df6/kube-rbac-proxy/0.log" Sep 30 19:26:48 crc 
kubenswrapper[4797]: I0930 19:26:48.201683 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-c58zh_3195fff1-53f5-491a-869b-0f7fc5e45df6/manager/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.373844 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-zj2w6_d69ffa93-8979-4922-8aee-7ea26fede6b4/manager/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.401180 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-zj2w6_d69ffa93-8979-4922-8aee-7ea26fede6b4/kube-rbac-proxy/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.458293 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc7c668c-46p2f_a618912d-66f8-4486-8e69-d3dc16f3cb34/kube-rbac-proxy/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.619021 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5c6649c9b9-x9zfh_3192f5d1-b6e4-4471-9adb-24c613a970f4/kube-rbac-proxy/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.835282 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5c6649c9b9-x9zfh_3192f5d1-b6e4-4471-9adb-24c613a970f4/operator/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.876104 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qkts8_43506dea-1282-457a-9201-a2c9f9baa6f3/registry-server/0.log" Sep 30 19:26:48 crc kubenswrapper[4797]: I0930 19:26:48.938320 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-dbxcb_05a4965a-f8d6-4859-ab5d-87773f6f6981/kube-rbac-proxy/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.071545 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-xh5rb_8a1ceaa0-b6e6-442d-84d0-3fba075b136c/kube-rbac-proxy/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.106368 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-dbxcb_05a4965a-f8d6-4859-ab5d-87773f6f6981/manager/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.229624 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-xh5rb_8a1ceaa0-b6e6-442d-84d0-3fba075b136c/manager/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.312911 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-gcwrs_ca63e090-a37a-4150-9c58-edf133c74c99/operator/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.528929 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-grp8f_c9c41380-c9ee-4467-b343-0f6cf78d51bc/kube-rbac-proxy/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.613929 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc7c668c-46p2f_a618912d-66f8-4486-8e69-d3dc16f3cb34/manager/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.638180 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-grp8f_c9c41380-c9ee-4467-b343-0f6cf78d51bc/manager/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.694317 4797 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5vhf6_ff0b391c-ac01-4a17-9381-a1e2b00d044d/kube-rbac-proxy/0.log" Sep 30 19:26:49 crc kubenswrapper[4797]: I0930 19:26:49.879922 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-66ff6_be859676-32a8-4144-94fd-ab0da94ce6bc/kube-rbac-proxy/0.log" Sep 30 19:26:50 crc kubenswrapper[4797]: I0930 19:26:50.007352 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-66ff6_be859676-32a8-4144-94fd-ab0da94ce6bc/manager/0.log" Sep 30 19:26:50 crc kubenswrapper[4797]: I0930 19:26:50.014526 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5vhf6_ff0b391c-ac01-4a17-9381-a1e2b00d044d/manager/0.log" Sep 30 19:26:50 crc kubenswrapper[4797]: I0930 19:26:50.043244 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-598db9dcc9-jbsh8_508a8f28-1d71-43ac-b24b-65f226abf807/kube-rbac-proxy/0.log" Sep 30 19:26:50 crc kubenswrapper[4797]: I0930 19:26:50.163637 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-598db9dcc9-jbsh8_508a8f28-1d71-43ac-b24b-65f226abf807/manager/0.log" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.658763 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:26:54 crc kubenswrapper[4797]: E0930 19:26:54.659898 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f83f53b-c063-4c9e-a37e-a36f4540b859" containerName="container-00" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.659911 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f83f53b-c063-4c9e-a37e-a36f4540b859" 
containerName="container-00" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.660127 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f83f53b-c063-4c9e-a37e-a36f4540b859" containerName="container-00" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.661515 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.672526 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.682122 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hmr\" (UniqueName: \"kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.682221 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.682253 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.784366 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f2hmr\" (UniqueName: \"kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.784503 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.784537 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.785017 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.785233 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:54 crc kubenswrapper[4797]: I0930 19:26:54.803725 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hmr\" (UniqueName: 
\"kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr\") pod \"certified-operators-kj2n5\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:55 crc kubenswrapper[4797]: I0930 19:26:55.038189 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:26:55 crc kubenswrapper[4797]: I0930 19:26:55.635243 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:26:56 crc kubenswrapper[4797]: I0930 19:26:56.021447 4797 generic.go:334] "Generic (PLEG): container finished" podID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerID="7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882" exitCode=0 Sep 30 19:26:56 crc kubenswrapper[4797]: I0930 19:26:56.021506 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerDied","Data":"7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882"} Sep 30 19:26:56 crc kubenswrapper[4797]: I0930 19:26:56.021688 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerStarted","Data":"41a82aafc2d8c21b249a7236652dd4fb32c0f64b8dd22abd8bd5c73b53785143"} Sep 30 19:26:56 crc kubenswrapper[4797]: I0930 19:26:56.023783 4797 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:26:58 crc kubenswrapper[4797]: I0930 19:26:58.042645 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerStarted","Data":"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11"} Sep 30 19:26:59 
crc kubenswrapper[4797]: I0930 19:26:59.053569 4797 generic.go:334] "Generic (PLEG): container finished" podID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerID="7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11" exitCode=0 Sep 30 19:26:59 crc kubenswrapper[4797]: I0930 19:26:59.053629 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerDied","Data":"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11"} Sep 30 19:27:00 crc kubenswrapper[4797]: I0930 19:27:00.066251 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerStarted","Data":"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0"} Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.038968 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.039489 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.110274 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.134452 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kj2n5" podStartSLOduration=7.467593998 podStartE2EDuration="11.134408802s" podCreationTimestamp="2025-09-30 19:26:54 +0000 UTC" firstStartedPulling="2025-09-30 19:26:56.023506053 +0000 UTC m=+6266.546005291" lastFinishedPulling="2025-09-30 19:26:59.690320847 +0000 UTC m=+6270.212820095" observedRunningTime="2025-09-30 19:27:00.08349743 +0000 UTC 
m=+6270.605996668" watchObservedRunningTime="2025-09-30 19:27:05.134408802 +0000 UTC m=+6275.656908040" Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.172738 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:05 crc kubenswrapper[4797]: I0930 19:27:05.354497 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.141830 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kj2n5" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="registry-server" containerID="cri-o://6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0" gracePeriod=2 Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.635588 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.688154 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content\") pod \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.688741 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities\") pod \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.688774 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2hmr\" (UniqueName: 
\"kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr\") pod \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\" (UID: \"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd\") " Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.690704 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities" (OuterVolumeSpecName: "utilities") pod "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" (UID: "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.697628 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr" (OuterVolumeSpecName: "kube-api-access-f2hmr") pod "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" (UID: "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd"). InnerVolumeSpecName "kube-api-access-f2hmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.775248 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" (UID: "45f19fab-5c85-4a8c-b6a0-adb2a694a1cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.791147 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.791196 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.791212 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2hmr\" (UniqueName: \"kubernetes.io/projected/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd-kube-api-access-f2hmr\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.859004 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ndhrr_1595bfb1-9c13-4148-a2ae-075a0fb0e05b/control-plane-machine-set-operator/0.log" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.970226 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8p4lg_c0adfd2b-83f6-41ea-beee-cd0a5ac3973b/kube-rbac-proxy/0.log" Sep 30 19:27:07 crc kubenswrapper[4797]: I0930 19:27:07.996529 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8p4lg_c0adfd2b-83f6-41ea-beee-cd0a5ac3973b/machine-api-operator/0.log" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.176941 4797 generic.go:334] "Generic (PLEG): container finished" podID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerID="6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0" exitCode=0 Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.176999 4797 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerDied","Data":"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0"} Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.177031 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kj2n5" event={"ID":"45f19fab-5c85-4a8c-b6a0-adb2a694a1cd","Type":"ContainerDied","Data":"41a82aafc2d8c21b249a7236652dd4fb32c0f64b8dd22abd8bd5c73b53785143"} Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.177055 4797 scope.go:117] "RemoveContainer" containerID="6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.177258 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kj2n5" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.214602 4797 scope.go:117] "RemoveContainer" containerID="7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.217825 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.227882 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kj2n5"] Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.251643 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" path="/var/lib/kubelet/pods/45f19fab-5c85-4a8c-b6a0-adb2a694a1cd/volumes" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.269938 4797 scope.go:117] "RemoveContainer" containerID="7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.298771 4797 scope.go:117] "RemoveContainer" 
containerID="6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0" Sep 30 19:27:08 crc kubenswrapper[4797]: E0930 19:27:08.299563 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0\": container with ID starting with 6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0 not found: ID does not exist" containerID="6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.299604 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0"} err="failed to get container status \"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0\": rpc error: code = NotFound desc = could not find container \"6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0\": container with ID starting with 6c2b09df7563d260feda0bef94343f39f4774622a2629355866dea0b8fcea4c0 not found: ID does not exist" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.299631 4797 scope.go:117] "RemoveContainer" containerID="7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11" Sep 30 19:27:08 crc kubenswrapper[4797]: E0930 19:27:08.300107 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11\": container with ID starting with 7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11 not found: ID does not exist" containerID="7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.300156 4797 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11"} err="failed to get container status \"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11\": rpc error: code = NotFound desc = could not find container \"7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11\": container with ID starting with 7cb12879e41d7c192a1e1ba92df74870209e4687326df1e1fdae3adfff299b11 not found: ID does not exist" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.300186 4797 scope.go:117] "RemoveContainer" containerID="7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882" Sep 30 19:27:08 crc kubenswrapper[4797]: E0930 19:27:08.300491 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882\": container with ID starting with 7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882 not found: ID does not exist" containerID="7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882" Sep 30 19:27:08 crc kubenswrapper[4797]: I0930 19:27:08.300519 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882"} err="failed to get container status \"7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882\": rpc error: code = NotFound desc = could not find container \"7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882\": container with ID starting with 7e73a798c6d00788282eacb0f0f784ca47094e23dfeb3375d0b9e7a6404d6882 not found: ID does not exist" Sep 30 19:27:20 crc kubenswrapper[4797]: I0930 19:27:20.682742 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2wj9r_9d1c6f31-57b5-4629-b6aa-abcf3394a4f4/cert-manager-controller/0.log" Sep 30 19:27:20 crc kubenswrapper[4797]: I0930 
19:27:20.788230 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qj8wf_696a605e-f3af-4acb-941d-22aa927ba890/cert-manager-cainjector/0.log" Sep 30 19:27:20 crc kubenswrapper[4797]: I0930 19:27:20.890472 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-77l4k_ddd09658-492a-4954-8d85-7c05dbe4b5c4/cert-manager-webhook/0.log" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.831560 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:28 crc kubenswrapper[4797]: E0930 19:27:28.832609 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="extract-utilities" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.832629 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="extract-utilities" Sep 30 19:27:28 crc kubenswrapper[4797]: E0930 19:27:28.832645 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="registry-server" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.832652 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="registry-server" Sep 30 19:27:28 crc kubenswrapper[4797]: E0930 19:27:28.832692 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="extract-content" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.832700 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" containerName="extract-content" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.832946 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f19fab-5c85-4a8c-b6a0-adb2a694a1cd" 
containerName="registry-server" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.835456 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.861895 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.950104 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.950901 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:28 crc kubenswrapper[4797]: I0930 19:27:28.951029 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp8zr\" (UniqueName: \"kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.053088 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " 
pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.053171 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.053224 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp8zr\" (UniqueName: \"kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.053681 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.053681 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.074905 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp8zr\" (UniqueName: \"kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr\") pod \"redhat-marketplace-x5bn2\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " 
pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.156716 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:29 crc kubenswrapper[4797]: I0930 19:27:29.615628 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:30 crc kubenswrapper[4797]: I0930 19:27:30.402197 4797 generic.go:334] "Generic (PLEG): container finished" podID="395ac072-f000-4cab-b948-e7838f7387fe" containerID="05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6" exitCode=0 Sep 30 19:27:30 crc kubenswrapper[4797]: I0930 19:27:30.402306 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerDied","Data":"05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6"} Sep 30 19:27:30 crc kubenswrapper[4797]: I0930 19:27:30.402597 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerStarted","Data":"44e0b8d6c3964b2fce3efc2092e10d55caa370d01a9070e39d78f8fd8a2a7b43"} Sep 30 19:27:31 crc kubenswrapper[4797]: I0930 19:27:31.424887 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerStarted","Data":"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb"} Sep 30 19:27:32 crc kubenswrapper[4797]: I0930 19:27:32.436180 4797 generic.go:334] "Generic (PLEG): container finished" podID="395ac072-f000-4cab-b948-e7838f7387fe" containerID="3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb" exitCode=0 Sep 30 19:27:32 crc kubenswrapper[4797]: I0930 19:27:32.436379 4797 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerDied","Data":"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb"} Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.116687 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-g9n7q_aeadbff5-abfe-4f8c-a24f-e62db0f23612/nmstate-console-plugin/0.log" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.331125 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x2tc2_ccc6478c-07c2-431f-a964-1db62dd3800e/nmstate-handler/0.log" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.443269 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-4nbj6_5e6982a1-49c0-428e-8b68-38899f1be907/kube-rbac-proxy/0.log" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.461850 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-4nbj6_5e6982a1-49c0-428e-8b68-38899f1be907/nmstate-metrics/0.log" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.470819 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerStarted","Data":"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc"} Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.504197 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x5bn2" podStartSLOduration=2.999072614 podStartE2EDuration="5.504166857s" podCreationTimestamp="2025-09-30 19:27:28 +0000 UTC" firstStartedPulling="2025-09-30 19:27:30.404176817 +0000 UTC m=+6300.926676045" lastFinishedPulling="2025-09-30 19:27:32.90927105 +0000 UTC m=+6303.431770288" observedRunningTime="2025-09-30 
19:27:33.486043098 +0000 UTC m=+6304.008542336" watchObservedRunningTime="2025-09-30 19:27:33.504166857 +0000 UTC m=+6304.026666095" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.661166 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-d2f5j_4ce57370-6b4e-4a2a-be84-6cea546156ac/nmstate-operator/0.log" Sep 30 19:27:33 crc kubenswrapper[4797]: I0930 19:27:33.738348 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-bxqvs_3417981c-8b7a-48f5-b504-c3a358706f7f/nmstate-webhook/0.log" Sep 30 19:27:39 crc kubenswrapper[4797]: I0930 19:27:39.156931 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:39 crc kubenswrapper[4797]: I0930 19:27:39.157566 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:39 crc kubenswrapper[4797]: I0930 19:27:39.213391 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:39 crc kubenswrapper[4797]: I0930 19:27:39.583403 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:39 crc kubenswrapper[4797]: I0930 19:27:39.651598 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:41 crc kubenswrapper[4797]: I0930 19:27:41.553785 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x5bn2" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="registry-server" containerID="cri-o://2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc" gracePeriod=2 Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.037238 4797 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.216244 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp8zr\" (UniqueName: \"kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr\") pod \"395ac072-f000-4cab-b948-e7838f7387fe\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.216491 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities\") pod \"395ac072-f000-4cab-b948-e7838f7387fe\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.216606 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content\") pod \"395ac072-f000-4cab-b948-e7838f7387fe\" (UID: \"395ac072-f000-4cab-b948-e7838f7387fe\") " Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.217274 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities" (OuterVolumeSpecName: "utilities") pod "395ac072-f000-4cab-b948-e7838f7387fe" (UID: "395ac072-f000-4cab-b948-e7838f7387fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.223745 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr" (OuterVolumeSpecName: "kube-api-access-dp8zr") pod "395ac072-f000-4cab-b948-e7838f7387fe" (UID: "395ac072-f000-4cab-b948-e7838f7387fe"). 
InnerVolumeSpecName "kube-api-access-dp8zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.229206 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "395ac072-f000-4cab-b948-e7838f7387fe" (UID: "395ac072-f000-4cab-b948-e7838f7387fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.319507 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.319554 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ac072-f000-4cab-b948-e7838f7387fe-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.319569 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp8zr\" (UniqueName: \"kubernetes.io/projected/395ac072-f000-4cab-b948-e7838f7387fe-kube-api-access-dp8zr\") on node \"crc\" DevicePath \"\"" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.578301 4797 generic.go:334] "Generic (PLEG): container finished" podID="395ac072-f000-4cab-b948-e7838f7387fe" containerID="2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc" exitCode=0 Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.578359 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerDied","Data":"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc"} Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.578390 4797 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bn2" event={"ID":"395ac072-f000-4cab-b948-e7838f7387fe","Type":"ContainerDied","Data":"44e0b8d6c3964b2fce3efc2092e10d55caa370d01a9070e39d78f8fd8a2a7b43"} Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.578405 4797 scope.go:117] "RemoveContainer" containerID="2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.578590 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bn2" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.611202 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.617265 4797 scope.go:117] "RemoveContainer" containerID="3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.619215 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bn2"] Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.642225 4797 scope.go:117] "RemoveContainer" containerID="05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.708302 4797 scope.go:117] "RemoveContainer" containerID="2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc" Sep 30 19:27:42 crc kubenswrapper[4797]: E0930 19:27:42.708776 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc\": container with ID starting with 2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc not found: ID does not exist" containerID="2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc" Sep 30 19:27:42 crc 
kubenswrapper[4797]: I0930 19:27:42.708832 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc"} err="failed to get container status \"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc\": rpc error: code = NotFound desc = could not find container \"2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc\": container with ID starting with 2fa1a59d8585d7634a65f1d3a134be6c59c9ae15648ed78098f605c670bd39dc not found: ID does not exist" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.708870 4797 scope.go:117] "RemoveContainer" containerID="3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb" Sep 30 19:27:42 crc kubenswrapper[4797]: E0930 19:27:42.709229 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb\": container with ID starting with 3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb not found: ID does not exist" containerID="3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.709261 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb"} err="failed to get container status \"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb\": rpc error: code = NotFound desc = could not find container \"3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb\": container with ID starting with 3813836ade167e6cbe5a1c06b56fc241c781185680ddbed482d62ca1af86febb not found: ID does not exist" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.709285 4797 scope.go:117] "RemoveContainer" containerID="05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6" Sep 30 
19:27:42 crc kubenswrapper[4797]: E0930 19:27:42.709853 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6\": container with ID starting with 05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6 not found: ID does not exist" containerID="05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6" Sep 30 19:27:42 crc kubenswrapper[4797]: I0930 19:27:42.709881 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6"} err="failed to get container status \"05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6\": rpc error: code = NotFound desc = could not find container \"05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6\": container with ID starting with 05ba1137194c4db6aa3227aea78c84a41838f049557573a95811093cc0b807c6 not found: ID does not exist" Sep 30 19:27:44 crc kubenswrapper[4797]: I0930 19:27:44.192478 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:27:44 crc kubenswrapper[4797]: I0930 19:27:44.193110 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:27:44 crc kubenswrapper[4797]: I0930 19:27:44.253558 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395ac072-f000-4cab-b948-e7838f7387fe" 
path="/var/lib/kubelet/pods/395ac072-f000-4cab-b948-e7838f7387fe/volumes" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.490058 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-kr68l_9e16cc99-60f9-4551-a18e-17f9beeca400/kube-rbac-proxy/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.667690 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-kr68l_9e16cc99-60f9-4551-a18e-17f9beeca400/controller/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.695407 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.891916 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.900095 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.912307 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:27:47 crc kubenswrapper[4797]: I0930 19:27:47.928380 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.114240 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.143189 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.151388 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.211146 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.403721 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-metrics/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.407791 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/controller/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.416057 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-frr-files/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.416057 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/cp-reloader/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.619736 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/frr-metrics/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.629784 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/kube-rbac-proxy/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.693131 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/kube-rbac-proxy-frr/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.886111 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/reloader/0.log" Sep 30 19:27:48 crc kubenswrapper[4797]: I0930 19:27:48.963153 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-667c9_f491bbe4-c848-4384-a932-13d5242e5871/frr-k8s-webhook-server/0.log" Sep 30 19:27:49 crc kubenswrapper[4797]: I0930 19:27:49.078216 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cb9fbcf6-fj2qz_10a1e68f-3d10-4da9-82db-d1043c94bcd8/manager/0.log" Sep 30 19:27:49 crc kubenswrapper[4797]: I0930 19:27:49.238053 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f4f79dfb8-6s6zz_77965127-1121-47ad-96b5-34229a106e24/webhook-server/0.log" Sep 30 19:27:49 crc kubenswrapper[4797]: I0930 19:27:49.486265 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dsmgz_0acb5984-08fc-4f2f-95d1-e65ba209a2f6/kube-rbac-proxy/0.log" Sep 30 19:27:50 crc kubenswrapper[4797]: I0930 19:27:50.056019 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dsmgz_0acb5984-08fc-4f2f-95d1-e65ba209a2f6/speaker/0.log" Sep 30 19:27:50 crc kubenswrapper[4797]: I0930 19:27:50.450710 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hwndx_0e18ed12-faf4-42df-a3f1-97f3c090fa57/frr/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.163988 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: 
I0930 19:28:03.316705 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.332484 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.389865 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.540612 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/util/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.547553 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/extract/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.554122 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcjkg2z_90d265e8-7eec-4b51-8784-3ec3efab5526/pull/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.703655 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.879007 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.887384 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log" Sep 30 19:28:03 crc kubenswrapper[4797]: I0930 19:28:03.913920 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.098593 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/pull/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.128394 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/extract/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.137162 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dlgh5l_55addd78-6666-44ec-9bb4-56a24edfbc41/util/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.300418 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.467514 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 
19:28:04.475249 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.488509 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.656431 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-utilities/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.660539 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/extract-content/0.log" Sep 30 19:28:04 crc kubenswrapper[4797]: I0930 19:28:04.905143 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.157744 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.172672 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.196846 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.371652 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-utilities/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.378563 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8gwld_42c7f1c4-357b-4a48-b0b0-71088e564851/registry-server/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.408972 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/extract-content/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.591153 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.780890 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.797483 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:28:05 crc kubenswrapper[4797]: I0930 19:28:05.960704 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.063727 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/pull/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.134588 4797 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/util/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.229355 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96tgt78_44d64586-8967-4e80-809d-e205470ca444/extract/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.373269 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7qh78_e5cc187a-c46a-4d5c-8e16-9f49d7b3c5b2/marketplace-operator/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.429495 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.564558 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8pxk9_b8e54563-9315-4e3e-9527-6d2849b83ee3/registry-server/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.652748 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.689626 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.691180 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.864247 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-utilities/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.928608 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/extract-content/0.log" Sep 30 19:28:06 crc kubenswrapper[4797]: I0930 19:28:06.947864 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.069817 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8k7hb_580f4a26-2d39-479e-8815-185e094f1469/registry-server/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.167503 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.170102 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.215776 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.384919 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-content/0.log" Sep 30 19:28:07 crc kubenswrapper[4797]: I0930 19:28:07.385056 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/extract-utilities/0.log" Sep 
30 19:28:08 crc kubenswrapper[4797]: I0930 19:28:08.222000 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgrpv_1b07c611-8431-4db0-b22d-89f8e391c90f/registry-server/0.log" Sep 30 19:28:14 crc kubenswrapper[4797]: I0930 19:28:14.191881 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:28:14 crc kubenswrapper[4797]: I0930 19:28:14.192580 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:28:19 crc kubenswrapper[4797]: I0930 19:28:19.775362 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-4rpbl_e3897355-dd17-4112-94fd-42c45c4cfa7f/prometheus-operator/0.log" Sep 30 19:28:19 crc kubenswrapper[4797]: I0930 19:28:19.880646 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f9c5648db-9pbgh_d153021c-fc00-4c10-91dd-69c13423dd4d/prometheus-operator-admission-webhook/0.log" Sep 30 19:28:19 crc kubenswrapper[4797]: I0930 19:28:19.923925 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f9c5648db-bvsr4_9ea30b0e-2f00-44a1-8e46-cc36ffc843a5/prometheus-operator-admission-webhook/0.log" Sep 30 19:28:20 crc kubenswrapper[4797]: I0930 19:28:20.076975 4797 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-nvvj2_f3d6451a-ed07-4fc4-9ebe-a8c8d514999c/operator/0.log" Sep 30 19:28:20 crc kubenswrapper[4797]: I0930 19:28:20.092222 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-r4wwh_91658bbd-3b03-40ce-af08-985444e42376/perses-operator/0.log" Sep 30 19:28:44 crc kubenswrapper[4797]: I0930 19:28:44.191703 4797 patch_prober.go:28] interesting pod/machine-config-daemon-b8bg9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:28:44 crc kubenswrapper[4797]: I0930 19:28:44.192293 4797 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:28:44 crc kubenswrapper[4797]: I0930 19:28:44.192356 4797 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" Sep 30 19:28:44 crc kubenswrapper[4797]: I0930 19:28:44.193408 4797 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6"} pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:28:44 crc kubenswrapper[4797]: I0930 19:28:44.193523 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" 
podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerName="machine-config-daemon" containerID="cri-o://487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" gracePeriod=600 Sep 30 19:28:44 crc kubenswrapper[4797]: E0930 19:28:44.328147 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:28:45 crc kubenswrapper[4797]: I0930 19:28:45.232251 4797 generic.go:334] "Generic (PLEG): container finished" podID="ec455803-9758-4ad4-a627-ce3ad63812c2" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" exitCode=0 Sep 30 19:28:45 crc kubenswrapper[4797]: I0930 19:28:45.232317 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" event={"ID":"ec455803-9758-4ad4-a627-ce3ad63812c2","Type":"ContainerDied","Data":"487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6"} Sep 30 19:28:45 crc kubenswrapper[4797]: I0930 19:28:45.232363 4797 scope.go:117] "RemoveContainer" containerID="75bc536e3104e25f05f9c42c898aa9d45566d6737d743e239442ca221fb257b6" Sep 30 19:28:45 crc kubenswrapper[4797]: I0930 19:28:45.233335 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:28:45 crc kubenswrapper[4797]: E0930 19:28:45.233895 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:28:56 crc kubenswrapper[4797]: I0930 19:28:56.238738 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:28:56 crc kubenswrapper[4797]: E0930 19:28:56.239724 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:29:08 crc kubenswrapper[4797]: I0930 19:29:08.238877 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:29:08 crc kubenswrapper[4797]: E0930 19:29:08.239685 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:29:22 crc kubenswrapper[4797]: I0930 19:29:22.239723 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:29:22 crc kubenswrapper[4797]: E0930 19:29:22.240790 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:29:35 crc kubenswrapper[4797]: I0930 19:29:35.238614 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:29:35 crc kubenswrapper[4797]: E0930 19:29:35.239633 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.948983 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:29:44 crc kubenswrapper[4797]: E0930 19:29:44.950645 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="extract-content" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.950664 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="extract-content" Sep 30 19:29:44 crc kubenswrapper[4797]: E0930 19:29:44.950717 4797 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="extract-utilities" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.950724 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="extract-utilities" Sep 30 19:29:44 crc kubenswrapper[4797]: E0930 19:29:44.950770 4797 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="registry-server" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.950777 4797 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="registry-server" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.951027 4797 memory_manager.go:354] "RemoveStaleState removing state" podUID="395ac072-f000-4cab-b948-e7838f7387fe" containerName="registry-server" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.953335 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.984260 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.986023 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r259c\" (UniqueName: \"kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.986241 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:44 crc kubenswrapper[4797]: I0930 19:29:44.986349 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities\") pod \"redhat-operators-g7vxx\" (UID: 
\"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.089179 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r259c\" (UniqueName: \"kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.089302 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.089373 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.089838 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.089933 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " 
pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.111341 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r259c\" (UniqueName: \"kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c\") pod \"redhat-operators-g7vxx\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.288756 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:45 crc kubenswrapper[4797]: I0930 19:29:45.828874 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:29:46 crc kubenswrapper[4797]: I0930 19:29:46.014265 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerStarted","Data":"288e6042bef3def87b16978431788c8ef67512eca04fd23208ecb1d109aa663d"} Sep 30 19:29:47 crc kubenswrapper[4797]: I0930 19:29:47.026385 4797 generic.go:334] "Generic (PLEG): container finished" podID="29c5be43-d73e-49a5-9aa2-b529f172661f" containerID="8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143" exitCode=0 Sep 30 19:29:47 crc kubenswrapper[4797]: I0930 19:29:47.026690 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerDied","Data":"8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143"} Sep 30 19:29:49 crc kubenswrapper[4797]: I0930 19:29:49.055687 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" 
event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerStarted","Data":"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b"} Sep 30 19:29:49 crc kubenswrapper[4797]: I0930 19:29:49.238219 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:29:49 crc kubenswrapper[4797]: E0930 19:29:49.238570 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:29:51 crc kubenswrapper[4797]: I0930 19:29:51.085014 4797 generic.go:334] "Generic (PLEG): container finished" podID="29c5be43-d73e-49a5-9aa2-b529f172661f" containerID="d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b" exitCode=0 Sep 30 19:29:51 crc kubenswrapper[4797]: I0930 19:29:51.085085 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerDied","Data":"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b"} Sep 30 19:29:52 crc kubenswrapper[4797]: I0930 19:29:52.094931 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerStarted","Data":"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70"} Sep 30 19:29:52 crc kubenswrapper[4797]: I0930 19:29:52.127254 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7vxx" podStartSLOduration=3.677484858 podStartE2EDuration="8.127226624s" 
podCreationTimestamp="2025-09-30 19:29:44 +0000 UTC" firstStartedPulling="2025-09-30 19:29:47.029529633 +0000 UTC m=+6437.552028881" lastFinishedPulling="2025-09-30 19:29:51.479271399 +0000 UTC m=+6442.001770647" observedRunningTime="2025-09-30 19:29:52.119674096 +0000 UTC m=+6442.642173334" watchObservedRunningTime="2025-09-30 19:29:52.127226624 +0000 UTC m=+6442.649725902" Sep 30 19:29:55 crc kubenswrapper[4797]: I0930 19:29:55.289300 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:55 crc kubenswrapper[4797]: I0930 19:29:55.289886 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:29:56 crc kubenswrapper[4797]: I0930 19:29:56.359487 4797 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7vxx" podUID="29c5be43-d73e-49a5-9aa2-b529f172661f" containerName="registry-server" probeResult="failure" output=< Sep 30 19:29:56 crc kubenswrapper[4797]: timeout: failed to connect service ":50051" within 1s Sep 30 19:29:56 crc kubenswrapper[4797]: > Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.176785 4797 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn"] Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.179970 4797 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.183475 4797 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.184696 4797 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.187222 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn"] Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.314727 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr85f\" (UniqueName: \"kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.314952 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.315133 4797 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.417631 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr85f\" (UniqueName: \"kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.418062 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.418121 4797 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.419094 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.435576 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.442307 4797 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr85f\" (UniqueName: \"kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f\") pod \"collect-profiles-29321010-z5zvn\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.519655 4797 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:00 crc kubenswrapper[4797]: I0930 19:30:00.991571 4797 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn"] Sep 30 19:30:01 crc kubenswrapper[4797]: I0930 19:30:01.185697 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" event={"ID":"c723dc04-719a-44f1-bbe9-df1e19e80bb4","Type":"ContainerStarted","Data":"c726a0efb35ac004bab1c6d13010b98f069f3a94c84c89b5fbf239fdf9f258a9"} Sep 30 19:30:01 crc kubenswrapper[4797]: I0930 19:30:01.187458 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" event={"ID":"c723dc04-719a-44f1-bbe9-df1e19e80bb4","Type":"ContainerStarted","Data":"be9a3642d40f743f249bdcaba26f7e92e62ada7491a1157d8d9338035eae820a"} Sep 30 19:30:01 crc kubenswrapper[4797]: I0930 19:30:01.206513 4797 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" 
podStartSLOduration=1.2064946380000001 podStartE2EDuration="1.206494638s" podCreationTimestamp="2025-09-30 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:30:01.197681135 +0000 UTC m=+6451.720180443" watchObservedRunningTime="2025-09-30 19:30:01.206494638 +0000 UTC m=+6451.728993876" Sep 30 19:30:01 crc kubenswrapper[4797]: I0930 19:30:01.238605 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:30:01 crc kubenswrapper[4797]: E0930 19:30:01.239043 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:30:02 crc kubenswrapper[4797]: I0930 19:30:02.216309 4797 generic.go:334] "Generic (PLEG): container finished" podID="c723dc04-719a-44f1-bbe9-df1e19e80bb4" containerID="c726a0efb35ac004bab1c6d13010b98f069f3a94c84c89b5fbf239fdf9f258a9" exitCode=0 Sep 30 19:30:02 crc kubenswrapper[4797]: I0930 19:30:02.216544 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" event={"ID":"c723dc04-719a-44f1-bbe9-df1e19e80bb4","Type":"ContainerDied","Data":"c726a0efb35ac004bab1c6d13010b98f069f3a94c84c89b5fbf239fdf9f258a9"} Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.631653 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.791682 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume\") pod \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.791786 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume\") pod \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.791833 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr85f\" (UniqueName: \"kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f\") pod \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\" (UID: \"c723dc04-719a-44f1-bbe9-df1e19e80bb4\") " Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.792863 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c723dc04-719a-44f1-bbe9-df1e19e80bb4" (UID: "c723dc04-719a-44f1-bbe9-df1e19e80bb4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.798674 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c723dc04-719a-44f1-bbe9-df1e19e80bb4" (UID: "c723dc04-719a-44f1-bbe9-df1e19e80bb4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.811024 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f" (OuterVolumeSpecName: "kube-api-access-hr85f") pod "c723dc04-719a-44f1-bbe9-df1e19e80bb4" (UID: "c723dc04-719a-44f1-bbe9-df1e19e80bb4"). InnerVolumeSpecName "kube-api-access-hr85f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.896834 4797 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c723dc04-719a-44f1-bbe9-df1e19e80bb4-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.897201 4797 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c723dc04-719a-44f1-bbe9-df1e19e80bb4-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:03 crc kubenswrapper[4797]: I0930 19:30:03.897219 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr85f\" (UniqueName: \"kubernetes.io/projected/c723dc04-719a-44f1-bbe9-df1e19e80bb4-kube-api-access-hr85f\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:04 crc kubenswrapper[4797]: I0930 19:30:04.269444 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" Sep 30 19:30:04 crc kubenswrapper[4797]: I0930 19:30:04.298883 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-z5zvn" event={"ID":"c723dc04-719a-44f1-bbe9-df1e19e80bb4","Type":"ContainerDied","Data":"be9a3642d40f743f249bdcaba26f7e92e62ada7491a1157d8d9338035eae820a"} Sep 30 19:30:04 crc kubenswrapper[4797]: I0930 19:30:04.299392 4797 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be9a3642d40f743f249bdcaba26f7e92e62ada7491a1157d8d9338035eae820a" Sep 30 19:30:04 crc kubenswrapper[4797]: I0930 19:30:04.299636 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc"] Sep 30 19:30:04 crc kubenswrapper[4797]: I0930 19:30:04.307442 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-6kchc"] Sep 30 19:30:05 crc kubenswrapper[4797]: I0930 19:30:05.337681 4797 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:30:05 crc kubenswrapper[4797]: I0930 19:30:05.389075 4797 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:30:05 crc kubenswrapper[4797]: I0930 19:30:05.596164 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:30:06 crc kubenswrapper[4797]: I0930 19:30:06.251266 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e54865-5d25-47e2-ae20-0db2c3c3a44b" path="/var/lib/kubelet/pods/40e54865-5d25-47e2-ae20-0db2c3c3a44b/volumes" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.301476 4797 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-g7vxx" podUID="29c5be43-d73e-49a5-9aa2-b529f172661f" containerName="registry-server" containerID="cri-o://79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70" gracePeriod=2 Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.839302 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.889546 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities\") pod \"29c5be43-d73e-49a5-9aa2-b529f172661f\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.889704 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r259c\" (UniqueName: \"kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c\") pod \"29c5be43-d73e-49a5-9aa2-b529f172661f\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.889959 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content\") pod \"29c5be43-d73e-49a5-9aa2-b529f172661f\" (UID: \"29c5be43-d73e-49a5-9aa2-b529f172661f\") " Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.891283 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities" (OuterVolumeSpecName: "utilities") pod "29c5be43-d73e-49a5-9aa2-b529f172661f" (UID: "29c5be43-d73e-49a5-9aa2-b529f172661f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.915411 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c" (OuterVolumeSpecName: "kube-api-access-r259c") pod "29c5be43-d73e-49a5-9aa2-b529f172661f" (UID: "29c5be43-d73e-49a5-9aa2-b529f172661f"). InnerVolumeSpecName "kube-api-access-r259c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.983135 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29c5be43-d73e-49a5-9aa2-b529f172661f" (UID: "29c5be43-d73e-49a5-9aa2-b529f172661f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.992259 4797 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.992297 4797 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c5be43-d73e-49a5-9aa2-b529f172661f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:07 crc kubenswrapper[4797]: I0930 19:30:07.992308 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r259c\" (UniqueName: \"kubernetes.io/projected/29c5be43-d73e-49a5-9aa2-b529f172661f-kube-api-access-r259c\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.315868 4797 generic.go:334] "Generic (PLEG): container finished" podID="29c5be43-d73e-49a5-9aa2-b529f172661f" 
containerID="79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70" exitCode=0 Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.315920 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerDied","Data":"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70"} Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.315945 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7vxx" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.315975 4797 scope.go:117] "RemoveContainer" containerID="79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.315958 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7vxx" event={"ID":"29c5be43-d73e-49a5-9aa2-b529f172661f","Type":"ContainerDied","Data":"288e6042bef3def87b16978431788c8ef67512eca04fd23208ecb1d109aa663d"} Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.348119 4797 scope.go:117] "RemoveContainer" containerID="d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.351744 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.363257 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7vxx"] Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.371856 4797 scope.go:117] "RemoveContainer" containerID="8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.452842 4797 scope.go:117] "RemoveContainer" containerID="79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70" Sep 30 19:30:08 crc 
kubenswrapper[4797]: E0930 19:30:08.453424 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70\": container with ID starting with 79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70 not found: ID does not exist" containerID="79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.453552 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70"} err="failed to get container status \"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70\": rpc error: code = NotFound desc = could not find container \"79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70\": container with ID starting with 79f8206aa38876fc32e95a5803375bd356c5d10c8c92599c60f1a4cbc2871f70 not found: ID does not exist" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.453646 4797 scope.go:117] "RemoveContainer" containerID="d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b" Sep 30 19:30:08 crc kubenswrapper[4797]: E0930 19:30:08.454192 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b\": container with ID starting with d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b not found: ID does not exist" containerID="d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.454230 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b"} err="failed to get container status 
\"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b\": rpc error: code = NotFound desc = could not find container \"d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b\": container with ID starting with d19a2d13577e115c0b5061c72c62a3563ce3233a7b45bfe34ea66e52b4f4cb2b not found: ID does not exist" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.454258 4797 scope.go:117] "RemoveContainer" containerID="8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143" Sep 30 19:30:08 crc kubenswrapper[4797]: E0930 19:30:08.454795 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143\": container with ID starting with 8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143 not found: ID does not exist" containerID="8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143" Sep 30 19:30:08 crc kubenswrapper[4797]: I0930 19:30:08.454917 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143"} err="failed to get container status \"8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143\": rpc error: code = NotFound desc = could not find container \"8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143\": container with ID starting with 8497a1abc28aaaabc38e326af3c60880570797fe309510301b0f2787617d8143 not found: ID does not exist" Sep 30 19:30:10 crc kubenswrapper[4797]: I0930 19:30:10.255286 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c5be43-d73e-49a5-9aa2-b529f172661f" path="/var/lib/kubelet/pods/29c5be43-d73e-49a5-9aa2-b529f172661f/volumes" Sep 30 19:30:12 crc kubenswrapper[4797]: I0930 19:30:12.239513 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 
19:30:12 crc kubenswrapper[4797]: E0930 19:30:12.240787 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:30:19 crc kubenswrapper[4797]: I0930 19:30:19.102162 4797 scope.go:117] "RemoveContainer" containerID="21f2908886b56ef00ffb32b1f0cba4efb4c80e2ba85bd670e2b6ef999539c590" Sep 30 19:30:19 crc kubenswrapper[4797]: I0930 19:30:19.135494 4797 scope.go:117] "RemoveContainer" containerID="4061ebdca6566b378517452e2a94ba2954e63cf30ed7665a63605af65acb90d0" Sep 30 19:30:23 crc kubenswrapper[4797]: I0930 19:30:23.238544 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:30:23 crc kubenswrapper[4797]: E0930 19:30:23.239390 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:30:35 crc kubenswrapper[4797]: I0930 19:30:35.239030 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:30:35 crc kubenswrapper[4797]: E0930 19:30:35.239709 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:30:38 crc kubenswrapper[4797]: I0930 19:30:38.676740 4797 generic.go:334] "Generic (PLEG): container finished" podID="b796179f-ac01-4663-b13f-ba4506fa30ec" containerID="3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db" exitCode=0 Sep 30 19:30:38 crc kubenswrapper[4797]: I0930 19:30:38.677203 4797 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5hqg/must-gather-z28w4" event={"ID":"b796179f-ac01-4663-b13f-ba4506fa30ec","Type":"ContainerDied","Data":"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db"} Sep 30 19:30:38 crc kubenswrapper[4797]: I0930 19:30:38.681540 4797 scope.go:117] "RemoveContainer" containerID="3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db" Sep 30 19:30:38 crc kubenswrapper[4797]: E0930 19:30:38.687301 4797 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb796179f_ac01_4663_b13f_ba4506fa30ec.slice/crio-conmon-3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:30:39 crc kubenswrapper[4797]: I0930 19:30:39.565945 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5hqg_must-gather-z28w4_b796179f-ac01-4663-b13f-ba4506fa30ec/gather/0.log" Sep 30 19:30:49 crc kubenswrapper[4797]: I0930 19:30:49.238670 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:30:49 crc kubenswrapper[4797]: E0930 19:30:49.239775 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.266731 4797 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5hqg/must-gather-z28w4"] Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.267542 4797 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-c5hqg/must-gather-z28w4" podUID="b796179f-ac01-4663-b13f-ba4506fa30ec" containerName="copy" containerID="cri-o://4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c" gracePeriod=2 Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.279865 4797 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5hqg/must-gather-z28w4"] Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.747075 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5hqg_must-gather-z28w4_b796179f-ac01-4663-b13f-ba4506fa30ec/copy/0.log" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.747655 4797 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.872261 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbn8\" (UniqueName: \"kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8\") pod \"b796179f-ac01-4663-b13f-ba4506fa30ec\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.872346 4797 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output\") pod \"b796179f-ac01-4663-b13f-ba4506fa30ec\" (UID: \"b796179f-ac01-4663-b13f-ba4506fa30ec\") " Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.894207 4797 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8" (OuterVolumeSpecName: "kube-api-access-kfbn8") pod "b796179f-ac01-4663-b13f-ba4506fa30ec" (UID: "b796179f-ac01-4663-b13f-ba4506fa30ec"). InnerVolumeSpecName "kube-api-access-kfbn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.930238 4797 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5hqg_must-gather-z28w4_b796179f-ac01-4663-b13f-ba4506fa30ec/copy/0.log" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.930718 4797 generic.go:334] "Generic (PLEG): container finished" podID="b796179f-ac01-4663-b13f-ba4506fa30ec" containerID="4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c" exitCode=143 Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.930766 4797 scope.go:117] "RemoveContainer" containerID="4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.930889 4797 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5hqg/must-gather-z28w4" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.953846 4797 scope.go:117] "RemoveContainer" containerID="3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db" Sep 30 19:30:53 crc kubenswrapper[4797]: I0930 19:30:53.974196 4797 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbn8\" (UniqueName: \"kubernetes.io/projected/b796179f-ac01-4663-b13f-ba4506fa30ec-kube-api-access-kfbn8\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.035609 4797 scope.go:117] "RemoveContainer" containerID="4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c" Sep 30 19:30:54 crc kubenswrapper[4797]: E0930 19:30:54.036197 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c\": container with ID starting with 4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c not found: ID does not exist" 
containerID="4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.036242 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c"} err="failed to get container status \"4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c\": rpc error: code = NotFound desc = could not find container \"4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c\": container with ID starting with 4455b94d05978b6458119461eece77b447e8979cd43283628a0e309e41bad16c not found: ID does not exist" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.036269 4797 scope.go:117] "RemoveContainer" containerID="3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db" Sep 30 19:30:54 crc kubenswrapper[4797]: E0930 19:30:54.036845 4797 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db\": container with ID starting with 3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db not found: ID does not exist" containerID="3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.036885 4797 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db"} err="failed to get container status \"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db\": rpc error: code = NotFound desc = could not find container \"3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db\": container with ID starting with 3114570775f1072e41e8832089e56197d53081f452dfab98431094a766c0c6db not found: ID does not exist" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.087087 4797 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b796179f-ac01-4663-b13f-ba4506fa30ec" (UID: "b796179f-ac01-4663-b13f-ba4506fa30ec"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.178929 4797 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b796179f-ac01-4663-b13f-ba4506fa30ec-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:54 crc kubenswrapper[4797]: I0930 19:30:54.257595 4797 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b796179f-ac01-4663-b13f-ba4506fa30ec" path="/var/lib/kubelet/pods/b796179f-ac01-4663-b13f-ba4506fa30ec/volumes" Sep 30 19:31:03 crc kubenswrapper[4797]: I0930 19:31:03.238959 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:31:03 crc kubenswrapper[4797]: E0930 19:31:03.240645 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:31:16 crc kubenswrapper[4797]: I0930 19:31:16.239596 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:31:16 crc kubenswrapper[4797]: E0930 19:31:16.241097 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:31:31 crc kubenswrapper[4797]: I0930 19:31:31.238921 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:31:31 crc kubenswrapper[4797]: E0930 19:31:31.240188 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:31:46 crc kubenswrapper[4797]: I0930 19:31:46.238778 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:31:46 crc kubenswrapper[4797]: E0930 19:31:46.239704 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:31:58 crc kubenswrapper[4797]: I0930 19:31:58.238559 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:31:58 crc kubenswrapper[4797]: E0930 19:31:58.239374 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:32:11 crc kubenswrapper[4797]: I0930 19:32:11.239119 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:32:11 crc kubenswrapper[4797]: E0930 19:32:11.240083 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2" Sep 30 19:32:26 crc kubenswrapper[4797]: I0930 19:32:26.238786 4797 scope.go:117] "RemoveContainer" containerID="487f8d1e18bf0e0c470121eb55e25d1e27bc13679bf62c0779a6fcca488e7bb6" Sep 30 19:32:26 crc kubenswrapper[4797]: E0930 19:32:26.240914 4797 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b8bg9_openshift-machine-config-operator(ec455803-9758-4ad4-a627-ce3ad63812c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-b8bg9" podUID="ec455803-9758-4ad4-a627-ce3ad63812c2"